Feb 16 16:13:26 localhost kernel: Linux version 5.14.0-677.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Feb 6 13:57:07 UTC 2026
Feb 16 16:13:26 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 16 16:13:26 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64 root=UUID=19ee07ed-c14b-4aa3-804d-f2cbdae2694f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 16 16:13:26 localhost kernel: BIOS-provided physical RAM map:
Feb 16 16:13:26 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 16 16:13:26 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 16 16:13:26 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 16 16:13:26 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 16 16:13:26 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 16 16:13:26 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 16 16:13:26 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 16 16:13:26 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Feb 16 16:13:26 localhost kernel: NX (Execute Disable) protection: active
Feb 16 16:13:26 localhost kernel: APIC: Static calls initialized
Feb 16 16:13:26 localhost kernel: SMBIOS 2.8 present.
Feb 16 16:13:26 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 16 16:13:26 localhost kernel: Hypervisor detected: KVM
Feb 16 16:13:26 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 16 16:13:26 localhost kernel: kvm-clock: using sched offset of 7976026459 cycles
Feb 16 16:13:26 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 16 16:13:26 localhost kernel: tsc: Detected 2800.000 MHz processor
Feb 16 16:13:26 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 16 16:13:26 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 16 16:13:26 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Feb 16 16:13:26 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb 16 16:13:26 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 16 16:13:26 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 16 16:13:26 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 16 16:13:26 localhost kernel: Using GB pages for direct mapping
Feb 16 16:13:26 localhost kernel: RAMDISK: [mem 0x1b6e4000-0x29b69fff]
Feb 16 16:13:26 localhost kernel: ACPI: Early table checksum verification disabled
Feb 16 16:13:26 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 16 16:13:26 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 16:13:26 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 16:13:26 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 16:13:26 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 16 16:13:26 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 16:13:26 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 16 16:13:26 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 16 16:13:26 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 16 16:13:26 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 16 16:13:26 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 16 16:13:26 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 16 16:13:26 localhost kernel: No NUMA configuration found
Feb 16 16:13:26 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Feb 16 16:13:26 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Feb 16 16:13:26 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Feb 16 16:13:26 localhost kernel: Zone ranges:
Feb 16 16:13:26 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 16 16:13:26 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 16 16:13:26 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Feb 16 16:13:26 localhost kernel:   Device   empty
Feb 16 16:13:26 localhost kernel: Movable zone start for each node
Feb 16 16:13:26 localhost kernel: Early memory node ranges
Feb 16 16:13:26 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 16 16:13:26 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 16 16:13:26 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Feb 16 16:13:26 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Feb 16 16:13:26 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 16 16:13:26 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 16 16:13:26 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 16 16:13:26 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 16 16:13:26 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 16 16:13:26 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 16 16:13:26 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 16 16:13:26 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 16 16:13:26 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 16 16:13:26 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 16 16:13:26 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 16 16:13:26 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 16 16:13:26 localhost kernel: TSC deadline timer available
Feb 16 16:13:26 localhost kernel: CPU topo: Max. logical packages:   8
Feb 16 16:13:26 localhost kernel: CPU topo: Max. logical dies:       8
Feb 16 16:13:26 localhost kernel: CPU topo: Max. dies per package:   1
Feb 16 16:13:26 localhost kernel: CPU topo: Max. threads per core:   1
Feb 16 16:13:26 localhost kernel: CPU topo: Num. cores per package:     1
Feb 16 16:13:26 localhost kernel: CPU topo: Num. threads per package:   1
Feb 16 16:13:26 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Feb 16 16:13:26 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb 16 16:13:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 16 16:13:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 16 16:13:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 16 16:13:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 16 16:13:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 16 16:13:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 16 16:13:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 16 16:13:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 16 16:13:26 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 16 16:13:26 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 16 16:13:26 localhost kernel: Booting paravirtualized kernel on KVM
Feb 16 16:13:26 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 16 16:13:26 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 16 16:13:26 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Feb 16 16:13:26 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Feb 16 16:13:26 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 16 16:13:26 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 16 16:13:26 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64 root=UUID=19ee07ed-c14b-4aa3-804d-f2cbdae2694f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 16 16:13:26 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64", will be passed to user space.
Feb 16 16:13:26 localhost kernel: random: crng init done
Feb 16 16:13:26 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 16 16:13:26 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Feb 16 16:13:26 localhost kernel: Fallback order for Node 0: 0 
Feb 16 16:13:26 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Feb 16 16:13:26 localhost kernel: Policy zone: Normal
Feb 16 16:13:26 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 16 16:13:26 localhost kernel: software IO TLB: area num 8.
Feb 16 16:13:26 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 16 16:13:26 localhost kernel: ftrace: allocating 49543 entries in 194 pages
Feb 16 16:13:26 localhost kernel: ftrace: allocated 194 pages with 3 groups
Feb 16 16:13:26 localhost kernel: Dynamic Preempt: voluntary
Feb 16 16:13:26 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 16 16:13:26 localhost kernel: rcu:         RCU event tracing is enabled.
Feb 16 16:13:26 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 16 16:13:26 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 16 16:13:26 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 16 16:13:26 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 16 16:13:26 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 16 16:13:26 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 16 16:13:26 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 16 16:13:26 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 16 16:13:26 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 16 16:13:26 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 16 16:13:26 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 16 16:13:26 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 16 16:13:26 localhost kernel: Console: colour VGA+ 80x25
Feb 16 16:13:26 localhost kernel: printk: console [ttyS0] enabled
Feb 16 16:13:26 localhost kernel: ACPI: Core revision 20230331
Feb 16 16:13:26 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 16 16:13:26 localhost kernel: x2apic enabled
Feb 16 16:13:26 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Feb 16 16:13:26 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 16 16:13:26 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Feb 16 16:13:26 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 16 16:13:26 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 16 16:13:26 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 16 16:13:26 localhost kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Feb 16 16:13:26 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 16 16:13:26 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 16 16:13:26 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 16 16:13:26 localhost kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Feb 16 16:13:26 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 16 16:13:26 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Feb 16 16:13:26 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 16 16:13:26 localhost kernel: active return thunk: retbleed_return_thunk
Feb 16 16:13:26 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 16 16:13:26 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 16 16:13:26 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 16 16:13:26 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 16 16:13:26 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 16 16:13:26 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 16 16:13:26 localhost kernel: Freeing SMP alternatives memory: 40K
Feb 16 16:13:26 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 16 16:13:26 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Feb 16 16:13:26 localhost kernel: landlock: Up and running.
Feb 16 16:13:26 localhost kernel: Yama: becoming mindful.
Feb 16 16:13:26 localhost kernel: SELinux:  Initializing.
Feb 16 16:13:26 localhost kernel: LSM support for eBPF active
Feb 16 16:13:26 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 16 16:13:26 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 16 16:13:26 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 16 16:13:26 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 16 16:13:26 localhost kernel: ... version:                0
Feb 16 16:13:26 localhost kernel: ... bit width:              48
Feb 16 16:13:26 localhost kernel: ... generic registers:      6
Feb 16 16:13:26 localhost kernel: ... value mask:             0000ffffffffffff
Feb 16 16:13:26 localhost kernel: ... max period:             00007fffffffffff
Feb 16 16:13:26 localhost kernel: ... fixed-purpose events:   0
Feb 16 16:13:26 localhost kernel: ... event mask:             000000000000003f
Feb 16 16:13:26 localhost kernel: signal: max sigframe size: 1776
Feb 16 16:13:26 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 16 16:13:26 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 16 16:13:26 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 16 16:13:26 localhost kernel: smpboot: x86: Booting SMP configuration:
Feb 16 16:13:26 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 16 16:13:26 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 16 16:13:26 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Feb 16 16:13:26 localhost kernel: node 0 deferred pages initialised in 12ms
Feb 16 16:13:26 localhost kernel: Memory: 7617740K/8388068K available (16384K kernel code, 5795K rwdata, 13944K rodata, 4204K init, 7180K bss, 764408K reserved, 0K cma-reserved)
Feb 16 16:13:26 localhost kernel: devtmpfs: initialized
Feb 16 16:13:26 localhost kernel: x86/mm: Memory block size: 128MB
Feb 16 16:13:26 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 16 16:13:26 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Feb 16 16:13:26 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 16 16:13:26 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 16 16:13:26 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Feb 16 16:13:26 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 16 16:13:26 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 16 16:13:26 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 16 16:13:26 localhost kernel: audit: type=2000 audit(1771258404.397:1): state=initialized audit_enabled=0 res=1
Feb 16 16:13:26 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 16 16:13:26 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 16 16:13:26 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 16 16:13:26 localhost kernel: cpuidle: using governor menu
Feb 16 16:13:26 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 16 16:13:26 localhost kernel: PCI: Using configuration type 1 for base access
Feb 16 16:13:26 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 16 16:13:26 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 16 16:13:26 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Feb 16 16:13:26 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Feb 16 16:13:26 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 16 16:13:26 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 16 16:13:26 localhost kernel: Demotion targets for Node 0: null
Feb 16 16:13:26 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 16 16:13:26 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 16 16:13:26 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 16 16:13:26 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 16 16:13:26 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 16 16:13:26 localhost kernel: ACPI: Interpreter enabled
Feb 16 16:13:26 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 16 16:13:26 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 16 16:13:26 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 16 16:13:26 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 16 16:13:26 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 16 16:13:26 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 16 16:13:26 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [3] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [4] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [5] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [6] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [7] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [8] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [9] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [10] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [11] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [12] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [13] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [14] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [15] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [16] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [17] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [18] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [19] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [20] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [21] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [22] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [23] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [24] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [25] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [26] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [27] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [28] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [29] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [30] registered
Feb 16 16:13:26 localhost kernel: acpiphp: Slot [31] registered
Feb 16 16:13:26 localhost kernel: PCI host bridge to bus 0000:00
Feb 16 16:13:26 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 16 16:13:26 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 16 16:13:26 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 16 16:13:26 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 16 16:13:26 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Feb 16 16:13:26 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 16 16:13:26 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 16 16:13:26 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Feb 16 16:13:26 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Feb 16 16:13:26 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 16 16:13:26 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Feb 16 16:13:26 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Feb 16 16:13:26 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 16 16:13:26 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 16 16:13:26 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Feb 16 16:13:26 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Feb 16 16:13:26 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 16 16:13:26 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Feb 16 16:13:26 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Feb 16 16:13:26 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Feb 16 16:13:26 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Feb 16 16:13:26 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 16 16:13:26 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Feb 16 16:13:26 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Feb 16 16:13:26 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 16 16:13:26 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Feb 16 16:13:26 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Feb 16 16:13:26 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 16 16:13:26 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 16 16:13:26 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 16 16:13:26 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 16 16:13:26 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 16 16:13:26 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 16 16:13:26 localhost kernel: iommu: Default domain type: Translated
Feb 16 16:13:26 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 16 16:13:26 localhost kernel: SCSI subsystem initialized
Feb 16 16:13:26 localhost kernel: ACPI: bus type USB registered
Feb 16 16:13:26 localhost kernel: usbcore: registered new interface driver usbfs
Feb 16 16:13:26 localhost kernel: usbcore: registered new interface driver hub
Feb 16 16:13:26 localhost kernel: usbcore: registered new device driver usb
Feb 16 16:13:26 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 16 16:13:26 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 16 16:13:26 localhost kernel: PTP clock support registered
Feb 16 16:13:26 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 16 16:13:26 localhost kernel: NetLabel: Initializing
Feb 16 16:13:26 localhost kernel: NetLabel:  domain hash size = 128
Feb 16 16:13:26 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 16 16:13:26 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 16 16:13:26 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 16 16:13:26 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 16 16:13:26 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 16 16:13:26 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 16 16:13:26 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 16 16:13:26 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 16 16:13:26 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 16 16:13:26 localhost kernel: vgaarb: loaded
Feb 16 16:13:26 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 16 16:13:26 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 16 16:13:26 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 16 16:13:26 localhost kernel: pnp: PnP ACPI init
Feb 16 16:13:26 localhost kernel: pnp 00:03: [dma 2]
Feb 16 16:13:26 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 16 16:13:26 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 16 16:13:26 localhost kernel: NET: Registered PF_INET protocol family
Feb 16 16:13:26 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 16 16:13:26 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Feb 16 16:13:26 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 16 16:13:26 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 16 16:13:26 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 16 16:13:26 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Feb 16 16:13:26 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Feb 16 16:13:26 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 16 16:13:26 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 16 16:13:26 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 16 16:13:26 localhost kernel: NET: Registered PF_XDP protocol family
Feb 16 16:13:26 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 16 16:13:26 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 16 16:13:26 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 16 16:13:26 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 16 16:13:26 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 16 16:13:26 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 16 16:13:26 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 16 16:13:26 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 50502 usecs
Feb 16 16:13:26 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 16 16:13:26 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 16 16:13:26 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 16 16:13:26 localhost kernel: ACPI: bus type thunderbolt registered
Feb 16 16:13:26 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 16 16:13:26 localhost kernel: Initialise system trusted keyrings
Feb 16 16:13:26 localhost kernel: Key type blacklist registered
Feb 16 16:13:26 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Feb 16 16:13:26 localhost kernel: zbud: loaded
Feb 16 16:13:26 localhost kernel: integrity: Platform Keyring initialized
Feb 16 16:13:26 localhost kernel: integrity: Machine keyring initialized
Feb 16 16:13:26 localhost kernel: Freeing initrd memory: 234008K
Feb 16 16:13:26 localhost kernel: NET: Registered PF_ALG protocol family
Feb 16 16:13:26 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 16 16:13:26 localhost kernel: Key type asymmetric registered
Feb 16 16:13:26 localhost kernel: Asymmetric key parser 'x509' registered
Feb 16 16:13:26 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 16 16:13:26 localhost kernel: io scheduler mq-deadline registered
Feb 16 16:13:26 localhost kernel: io scheduler kyber registered
Feb 16 16:13:26 localhost kernel: io scheduler bfq registered
Feb 16 16:13:26 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 16 16:13:26 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 16 16:13:26 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 16 16:13:26 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 16 16:13:26 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 16 16:13:26 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 16 16:13:26 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 16 16:13:26 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 16 16:13:26 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 16 16:13:26 localhost kernel: Non-volatile memory driver v1.3
Feb 16 16:13:26 localhost kernel: rdac: device handler registered
Feb 16 16:13:26 localhost kernel: hp_sw: device handler registered
Feb 16 16:13:26 localhost kernel: emc: device handler registered
Feb 16 16:13:26 localhost kernel: alua: device handler registered
Feb 16 16:13:26 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 16 16:13:26 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 16 16:13:26 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 16 16:13:26 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 16 16:13:26 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 16 16:13:26 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 16 16:13:26 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 16 16:13:26 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-677.el9.x86_64 uhci_hcd
Feb 16 16:13:26 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 16 16:13:26 localhost kernel: hub 1-0:1.0: USB hub found
Feb 16 16:13:26 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 16 16:13:26 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 16 16:13:26 localhost kernel: usbserial: USB Serial support registered for generic
Feb 16 16:13:26 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 16 16:13:26 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 16 16:13:26 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 16 16:13:26 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 16 16:13:26 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 16 16:13:26 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 16 16:13:26 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 16 16:13:26 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-16T16:13:25 UTC (1771258405)
Feb 16 16:13:26 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 16 16:13:26 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Feb 16 16:13:26 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 16 16:13:26 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 16 16:13:26 localhost kernel: usbcore: registered new interface driver usbhid
Feb 16 16:13:26 localhost kernel: usbhid: USB HID core driver
Feb 16 16:13:26 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 16 16:13:26 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 16 16:13:26 localhost kernel: Initializing XFRM netlink socket
Feb 16 16:13:26 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 16 16:13:26 localhost kernel: Segment Routing with IPv6
Feb 16 16:13:26 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 16 16:13:26 localhost kernel: mpls_gso: MPLS GSO support
Feb 16 16:13:26 localhost kernel: IPI shorthand broadcast: enabled
Feb 16 16:13:26 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 16 16:13:26 localhost kernel: AES CTR mode by8 optimization enabled
Feb 16 16:13:26 localhost kernel: sched_clock: Marking stable (1131001700, 142372750)->(1392502750, -119128300)
Feb 16 16:13:26 localhost kernel: registered taskstats version 1
Feb 16 16:13:26 localhost kernel: Loading compiled-in X.509 certificates
Feb 16 16:13:26 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 59012b35a0d3f62f49a40ad60f91f66a06ca3be0'
Feb 16 16:13:26 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 16 16:13:26 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 16 16:13:26 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Feb 16 16:13:26 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Feb 16 16:13:26 localhost kernel: Demotion targets for Node 0: null
Feb 16 16:13:26 localhost kernel: page_owner is disabled
Feb 16 16:13:26 localhost kernel: Key type .fscrypt registered
Feb 16 16:13:26 localhost kernel: Key type fscrypt-provisioning registered
Feb 16 16:13:26 localhost kernel: Key type big_key registered
Feb 16 16:13:26 localhost kernel: Key type encrypted registered
Feb 16 16:13:26 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 16 16:13:26 localhost kernel: Loading compiled-in module X.509 certificates
Feb 16 16:13:26 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 59012b35a0d3f62f49a40ad60f91f66a06ca3be0'
Feb 16 16:13:26 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 16 16:13:26 localhost kernel: ima: No architecture policies found
Feb 16 16:13:26 localhost kernel: evm: Initialising EVM extended attributes:
Feb 16 16:13:26 localhost kernel: evm: security.selinux
Feb 16 16:13:26 localhost kernel: evm: security.SMACK64 (disabled)
Feb 16 16:13:26 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 16 16:13:26 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 16 16:13:26 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 16 16:13:26 localhost kernel: evm: security.apparmor (disabled)
Feb 16 16:13:26 localhost kernel: evm: security.ima
Feb 16 16:13:26 localhost kernel: evm: security.capability
Feb 16 16:13:26 localhost kernel: evm: HMAC attrs: 0x1
Feb 16 16:13:26 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 16 16:13:26 localhost kernel: Running certificate verification RSA selftest
Feb 16 16:13:26 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 16 16:13:26 localhost kernel: Running certificate verification ECDSA selftest
Feb 16 16:13:26 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Feb 16 16:13:26 localhost kernel: clk: Disabling unused clocks
Feb 16 16:13:26 localhost kernel: Freeing unused decrypted memory: 2028K
Feb 16 16:13:26 localhost kernel: Freeing unused kernel image (initmem) memory: 4204K
Feb 16 16:13:26 localhost kernel: Write protecting the kernel read-only data: 30720k
Feb 16 16:13:26 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 392K
Feb 16 16:13:26 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 16 16:13:26 localhost kernel: Run /init as init process
Feb 16 16:13:26 localhost kernel:   with arguments:
Feb 16 16:13:26 localhost kernel:     /init
Feb 16 16:13:26 localhost kernel:   with environment:
Feb 16 16:13:26 localhost kernel:     HOME=/
Feb 16 16:13:26 localhost kernel:     TERM=linux
Feb 16 16:13:26 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64
Feb 16 16:13:26 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 16 16:13:26 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 16 16:13:26 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 16 16:13:26 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 16 16:13:26 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 16 16:13:26 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 16 16:13:26 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 16 16:13:26 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 16 16:13:26 localhost systemd[1]: Detected virtualization kvm.
Feb 16 16:13:26 localhost systemd[1]: Detected architecture x86-64.
Feb 16 16:13:26 localhost systemd[1]: Running in initrd.
Feb 16 16:13:26 localhost systemd[1]: No hostname configured, using default hostname.
Feb 16 16:13:26 localhost systemd[1]: Hostname set to <localhost>.
Feb 16 16:13:26 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 16 16:13:26 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 16 16:13:26 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 16 16:13:26 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 16 16:13:26 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 16 16:13:26 localhost systemd[1]: Reached target Local File Systems.
Feb 16 16:13:26 localhost systemd[1]: Reached target Path Units.
Feb 16 16:13:26 localhost systemd[1]: Reached target Slice Units.
Feb 16 16:13:26 localhost systemd[1]: Reached target Swaps.
Feb 16 16:13:26 localhost systemd[1]: Reached target Timer Units.
Feb 16 16:13:26 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 16 16:13:26 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 16 16:13:26 localhost systemd[1]: Listening on Journal Socket.
Feb 16 16:13:26 localhost systemd[1]: Listening on udev Control Socket.
Feb 16 16:13:26 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 16 16:13:26 localhost systemd[1]: Reached target Socket Units.
Feb 16 16:13:26 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 16 16:13:26 localhost systemd[1]: Starting Journal Service...
Feb 16 16:13:26 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 16 16:13:26 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 16 16:13:26 localhost systemd[1]: Starting Create System Users...
Feb 16 16:13:26 localhost systemd[1]: Starting Setup Virtual Console...
Feb 16 16:13:26 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 16 16:13:26 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 16 16:13:26 localhost systemd[1]: Finished Create System Users.
Feb 16 16:13:26 localhost systemd-journald[305]: Journal started
Feb 16 16:13:26 localhost systemd-journald[305]: Runtime Journal (/run/log/journal/a72ae0da02c047299eb8f910b339152d) is 8.0M, max 153.6M, 145.6M free.
Feb 16 16:13:26 localhost systemd-sysusers[309]: Creating group 'users' with GID 100.
Feb 16 16:13:26 localhost systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Feb 16 16:13:26 localhost systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 16 16:13:26 localhost systemd[1]: Started Journal Service.
Feb 16 16:13:26 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 16 16:13:26 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 16 16:13:26 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 16 16:13:26 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 16 16:13:26 localhost systemd[1]: Finished Setup Virtual Console.
Feb 16 16:13:26 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 16 16:13:26 localhost systemd[1]: Starting dracut cmdline hook...
Feb 16 16:13:26 localhost dracut-cmdline[323]: dracut-9 dracut-057-110.git20260130.el9
Feb 16 16:13:26 localhost dracut-cmdline[323]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-677.el9.x86_64 root=UUID=19ee07ed-c14b-4aa3-804d-f2cbdae2694f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 16 16:13:26 localhost systemd[1]: Finished dracut cmdline hook.
Feb 16 16:13:26 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 16 16:13:26 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 16 16:13:26 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 16 16:13:26 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Feb 16 16:13:26 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 16 16:13:26 localhost kernel: RPC: Registered udp transport module.
Feb 16 16:13:26 localhost kernel: RPC: Registered tcp transport module.
Feb 16 16:13:26 localhost kernel: RPC: Registered tcp-with-tls transport module.
Feb 16 16:13:26 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 16 16:13:26 localhost rpc.statd[441]: Version 2.5.4 starting
Feb 16 16:13:26 localhost rpc.statd[441]: Initializing NSM state
Feb 16 16:13:26 localhost rpc.idmapd[446]: Setting log level to 0
Feb 16 16:13:26 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 16 16:13:27 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 16 16:13:27 localhost systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Feb 16 16:13:27 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 16 16:13:27 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 16 16:13:27 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 16 16:13:27 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 16 16:13:27 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 16 16:13:27 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 16 16:13:27 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 16 16:13:27 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 16 16:13:27 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 16 16:13:27 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 16 16:13:27 localhost systemd[1]: Reached target Network.
Feb 16 16:13:27 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 16 16:13:27 localhost systemd[1]: Starting dracut initqueue hook...
Feb 16 16:13:27 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 16 16:13:27 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 16 16:13:27 localhost systemd[1]: Reached target System Initialization.
Feb 16 16:13:27 localhost systemd[1]: Reached target Basic System.
Feb 16 16:13:27 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Feb 16 16:13:27 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Feb 16 16:13:27 localhost kernel: libata version 3.00 loaded.
Feb 16 16:13:27 localhost kernel:  vda: vda1
Feb 16 16:13:27 localhost kernel: ACPI: bus type drm_connector registered
Feb 16 16:13:27 localhost systemd-udevd[486]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 16:13:27 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 16 16:13:27 localhost kernel: scsi host0: ata_piix
Feb 16 16:13:27 localhost kernel: scsi host1: ata_piix
Feb 16 16:13:27 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Feb 16 16:13:27 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Feb 16 16:13:27 localhost systemd[1]: Found device /dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f.
Feb 16 16:13:27 localhost systemd[1]: Reached target Initrd Root Device.
Feb 16 16:13:27 localhost kernel: ata1: found unknown device (class 0)
Feb 16 16:13:27 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 16 16:13:27 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 16 16:13:27 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 16 16:13:27 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 16 16:13:27 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 16 16:13:27 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 16 16:13:27 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 16 16:13:27 localhost kernel: Console: switching to colour dummy device 80x25
Feb 16 16:13:27 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 16 16:13:27 localhost kernel: [drm] features: -context_init
Feb 16 16:13:27 localhost kernel: [drm] number of scanouts: 1
Feb 16 16:13:27 localhost kernel: [drm] number of cap sets: 0
Feb 16 16:13:27 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Feb 16 16:13:27 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb 16 16:13:27 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 16 16:13:27 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 16 16:13:27 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 16 16:13:27 localhost systemd[1]: Finished dracut initqueue hook.
Feb 16 16:13:27 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 16 16:13:27 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 16 16:13:27 localhost systemd[1]: Reached target Remote File Systems.
Feb 16 16:13:27 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 16 16:13:27 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 16 16:13:27 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f...
Feb 16 16:13:27 localhost systemd-fsck[567]: /usr/sbin/fsck.xfs: XFS file system.
Feb 16 16:13:27 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f.
Feb 16 16:13:27 localhost systemd[1]: Mounting /sysroot...
Feb 16 16:13:28 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 16 16:13:28 localhost kernel: XFS (vda1): Mounting V5 Filesystem 19ee07ed-c14b-4aa3-804d-f2cbdae2694f
Feb 16 16:13:28 localhost kernel: XFS (vda1): Ending clean mount
Feb 16 16:13:28 localhost systemd[1]: Mounted /sysroot.
Feb 16 16:13:28 localhost systemd[1]: Reached target Initrd Root File System.
Feb 16 16:13:28 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 16 16:13:28 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 16 16:13:28 localhost systemd[1]: Reached target Initrd File Systems.
Feb 16 16:13:28 localhost systemd[1]: Reached target Initrd Default Target.
Feb 16 16:13:28 localhost systemd[1]: Starting dracut mount hook...
Feb 16 16:13:28 localhost systemd[1]: Finished dracut mount hook.
Feb 16 16:13:28 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 16 16:13:28 localhost rpc.idmapd[446]: exiting on signal 15
Feb 16 16:13:28 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 16 16:13:28 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 16 16:13:28 localhost systemd[1]: Stopped target Network.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Timer Units.
Feb 16 16:13:28 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 16 16:13:28 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Basic System.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Path Units.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Remote File Systems.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Slice Units.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Socket Units.
Feb 16 16:13:28 localhost systemd[1]: Stopped target System Initialization.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Local File Systems.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Swaps.
Feb 16 16:13:28 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped dracut mount hook.
Feb 16 16:13:28 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 16 16:13:28 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 16 16:13:28 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 16 16:13:28 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 16 16:13:28 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 16 16:13:28 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 16 16:13:28 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 16 16:13:28 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 16 16:13:28 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 16 16:13:28 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 16 16:13:28 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 16 16:13:28 localhost systemd[1]: systemd-udevd.service: Consumed 1.228s CPU time.
Feb 16 16:13:28 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Closed udev Control Socket.
Feb 16 16:13:28 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Closed udev Kernel Socket.
Feb 16 16:13:28 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 16 16:13:28 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 16 16:13:28 localhost systemd[1]: Starting Cleanup udev Database...
Feb 16 16:13:28 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 16 16:13:28 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 16 16:13:28 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Stopped Create System Users.
Feb 16 16:13:28 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 16 16:13:28 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 16 16:13:28 localhost systemd[1]: Finished Cleanup udev Database.
Feb 16 16:13:28 localhost systemd[1]: Reached target Switch Root.
Feb 16 16:13:28 localhost systemd[1]: Starting Switch Root...
Feb 16 16:13:28 localhost systemd[1]: Switching root.
Feb 16 16:13:28 localhost systemd-journald[305]: Journal stopped
Feb 16 16:13:29 localhost systemd-journald[305]: Received SIGTERM from PID 1 (systemd).
Feb 16 16:13:29 localhost kernel: audit: type=1404 audit(1771258408.828:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 16 16:13:29 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 16:13:29 localhost kernel: SELinux:  policy capability open_perms=1
Feb 16 16:13:29 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 16:13:29 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 16 16:13:29 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 16:13:29 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 16:13:29 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 16:13:29 localhost kernel: audit: type=1403 audit(1771258408.946:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 16 16:13:29 localhost systemd[1]: Successfully loaded SELinux policy in 121.461ms.
Feb 16 16:13:29 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 31.235ms.
Feb 16 16:13:29 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 16 16:13:29 localhost systemd[1]: Detected virtualization kvm.
Feb 16 16:13:29 localhost systemd[1]: Detected architecture x86-64.
Feb 16 16:13:29 localhost systemd-rc-local-generator[652]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 16:13:29 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 16 16:13:29 localhost systemd[1]: Stopped Switch Root.
Feb 16 16:13:29 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 16 16:13:29 localhost systemd[1]: Created slice Slice /system/getty.
Feb 16 16:13:29 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 16 16:13:29 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 16 16:13:29 localhost systemd[1]: Created slice User and Session Slice.
Feb 16 16:13:29 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 16 16:13:29 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 16 16:13:29 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 16 16:13:29 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 16 16:13:29 localhost systemd[1]: Stopped target Switch Root.
Feb 16 16:13:29 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 16 16:13:29 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 16 16:13:29 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 16 16:13:29 localhost systemd[1]: Reached target Path Units.
Feb 16 16:13:29 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 16 16:13:29 localhost systemd[1]: Reached target Slice Units.
Feb 16 16:13:29 localhost systemd[1]: Reached target Swaps.
Feb 16 16:13:29 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 16 16:13:29 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 16 16:13:29 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 16 16:13:29 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 16 16:13:29 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 16 16:13:29 localhost systemd[1]: Listening on udev Control Socket.
Feb 16 16:13:29 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 16 16:13:29 localhost systemd[1]: Mounting Huge Pages File System...
Feb 16 16:13:29 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 16 16:13:29 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 16 16:13:29 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 16 16:13:29 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 16 16:13:29 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 16 16:13:29 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 16 16:13:29 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 16 16:13:29 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Feb 16 16:13:29 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 16 16:13:29 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 16 16:13:29 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 16 16:13:29 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 16 16:13:29 localhost systemd[1]: Stopped Journal Service.
Feb 16 16:13:29 localhost systemd[1]: Starting Journal Service...
Feb 16 16:13:29 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 16 16:13:29 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 16 16:13:29 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 16 16:13:29 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 16 16:13:29 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 16 16:13:29 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 16 16:13:29 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 16 16:13:29 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 16 16:13:29 localhost kernel: fuse: init (API version 7.37)
Feb 16 16:13:29 localhost systemd[1]: Mounted Huge Pages File System.
Feb 16 16:13:29 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 16 16:13:29 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 16 16:13:29 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 16 16:13:29 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 16 16:13:29 localhost systemd-journald[700]: Journal started
Feb 16 16:13:29 localhost systemd-journald[700]: Runtime Journal (/run/log/journal/c582f88d1fdab2d576c3dadef84540f2) is 8.0M, max 153.6M, 145.6M free.
Feb 16 16:13:29 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 16 16:13:29 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 16 16:13:29 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 16 16:13:29 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 16 16:13:29 localhost systemd[1]: Started Journal Service.
Feb 16 16:13:29 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 16 16:13:29 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 16 16:13:29 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 16 16:13:29 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Feb 16 16:13:29 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 16 16:13:29 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 16 16:13:29 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 16 16:13:29 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 16 16:13:29 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 16 16:13:29 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 16 16:13:29 localhost systemd[1]: Mounting FUSE Control File System...
Feb 16 16:13:29 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 16 16:13:29 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 16 16:13:29 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 16 16:13:29 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 16 16:13:29 localhost systemd[1]: Starting Load/Save OS Random Seed...
Feb 16 16:13:29 localhost systemd[1]: Starting Create System Users...
Feb 16 16:13:29 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 16 16:13:29 localhost systemd-journald[700]: Runtime Journal (/run/log/journal/c582f88d1fdab2d576c3dadef84540f2) is 8.0M, max 153.6M, 145.6M free.
Feb 16 16:13:29 localhost systemd-journald[700]: Received client request to flush runtime journal.
Feb 16 16:13:29 localhost systemd[1]: Mounted FUSE Control File System.
Feb 16 16:13:29 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 16 16:13:29 localhost systemd[1]: Finished Load/Save OS Random Seed.
Feb 16 16:13:29 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 16 16:13:29 localhost systemd[1]: Finished Create System Users.
Feb 16 16:13:29 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 16 16:13:29 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 16 16:13:29 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 16 16:13:29 localhost systemd[1]: Reached target Local File Systems.
Feb 16 16:13:29 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 16 16:13:29 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 16 16:13:29 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 16 16:13:29 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Feb 16 16:13:29 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 16 16:13:29 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 16 16:13:29 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 16 16:13:29 localhost bootctl[717]: Couldn't find EFI system partition, skipping.
Feb 16 16:13:29 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 16 16:13:30 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 16 16:13:30 localhost systemd[1]: Starting Security Auditing Service...
Feb 16 16:13:30 localhost systemd[1]: Starting RPC Bind...
Feb 16 16:13:30 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 16 16:13:30 localhost auditd[723]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Feb 16 16:13:30 localhost auditd[723]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Feb 16 16:13:30 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 16 16:13:30 localhost systemd[1]: Started RPC Bind.
Feb 16 16:13:30 localhost augenrules[728]: /sbin/augenrules: No change
Feb 16 16:13:30 localhost augenrules[743]: No rules
Feb 16 16:13:30 localhost augenrules[743]: enabled 1
Feb 16 16:13:30 localhost augenrules[743]: failure 1
Feb 16 16:13:30 localhost augenrules[743]: pid 723
Feb 16 16:13:30 localhost augenrules[743]: rate_limit 0
Feb 16 16:13:30 localhost augenrules[743]: backlog_limit 8192
Feb 16 16:13:30 localhost augenrules[743]: lost 0
Feb 16 16:13:30 localhost augenrules[743]: backlog 4
Feb 16 16:13:30 localhost augenrules[743]: backlog_wait_time 60000
Feb 16 16:13:30 localhost augenrules[743]: backlog_wait_time_actual 0
Feb 16 16:13:30 localhost systemd[1]: Started Security Auditing Service.
Feb 16 16:13:30 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 16 16:13:30 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 16 16:13:30 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 16 16:13:30 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 16 16:13:30 localhost systemd-udevd[751]: Using default interface naming scheme 'rhel-9.0'.
Feb 16 16:13:30 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 16 16:13:30 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 16 16:13:30 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 16 16:13:30 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 16 16:13:30 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 16 16:13:30 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 16 16:13:30 localhost systemd-udevd[754]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 16:13:30 localhost systemd[1]: Starting Update is Completed...
Feb 16 16:13:30 localhost systemd[1]: Finished Update is Completed.
Feb 16 16:13:30 localhost systemd[1]: Reached target System Initialization.
Feb 16 16:13:30 localhost systemd[1]: Started dnf makecache --timer.
Feb 16 16:13:30 localhost systemd[1]: Started Daily rotation of log files.
Feb 16 16:13:30 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 16 16:13:30 localhost systemd[1]: Reached target Timer Units.
Feb 16 16:13:30 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 16 16:13:30 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 16 16:13:30 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 16 16:13:30 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 16 16:13:30 localhost systemd[1]: Reached target Socket Units.
Feb 16 16:13:30 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Feb 16 16:13:30 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Feb 16 16:13:30 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 16 16:13:30 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 16 16:13:30 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 16 16:13:30 localhost systemd[1]: Reached target Basic System.
Feb 16 16:13:30 localhost dbus-broker-lau[797]: Ready
Feb 16 16:13:30 localhost systemd[1]: Starting NTP client/server...
Feb 16 16:13:30 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Feb 16 16:13:30 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 16 16:13:30 localhost systemd[1]: Starting IPv4 firewall with iptables...
Feb 16 16:13:30 localhost systemd[1]: Started irqbalance daemon.
Feb 16 16:13:30 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 16 16:13:30 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 16:13:30 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 16:13:30 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 16:13:30 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 16 16:13:30 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 16 16:13:30 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 16 16:13:30 localhost systemd[1]: Starting User Login Management...
Feb 16 16:13:30 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 16 16:13:30 localhost kernel: kvm_amd: TSC scaling supported
Feb 16 16:13:30 localhost kernel: kvm_amd: Nested Virtualization enabled
Feb 16 16:13:30 localhost kernel: kvm_amd: Nested Paging enabled
Feb 16 16:13:30 localhost kernel: kvm_amd: LBR virtualization supported
Feb 16 16:13:30 localhost chronyd[829]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 16 16:13:30 localhost chronyd[829]: Loaded 0 symmetric keys
Feb 16 16:13:30 localhost chronyd[829]: Using right/UTC timezone to obtain leap second data
Feb 16 16:13:30 localhost chronyd[829]: Loaded seccomp filter (level 2)
Feb 16 16:13:30 localhost systemd[1]: Started NTP client/server.
Feb 16 16:13:30 localhost systemd-logind[821]: New seat seat0.
Feb 16 16:13:30 localhost systemd-logind[821]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 16 16:13:30 localhost systemd-logind[821]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 16 16:13:30 localhost systemd[1]: Started User Login Management.
Feb 16 16:13:30 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 16 16:13:30 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Feb 16 16:13:30 localhost iptables.init[815]: iptables: Applying firewall rules: [  OK  ]
Feb 16 16:13:30 localhost systemd[1]: Finished IPv4 firewall with iptables.
Feb 16 16:13:31 localhost cloud-init[854]: Cloud-init v. 24.4-8.el9 running 'init-local' at Mon, 16 Feb 2026 16:13:31 +0000. Up 6.78 seconds.
Feb 16 16:13:31 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 16 16:13:31 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 16 16:13:31 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpx5bszncm.mount: Deactivated successfully.
Feb 16 16:13:31 localhost systemd[1]: Starting Hostname Service...
Feb 16 16:13:31 localhost systemd[1]: Started Hostname Service.
Feb 16 16:13:31 np0005621130.novalocal systemd-hostnamed[868]: Hostname set to <np0005621130.novalocal> (static)
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Reached target Preparation for Network.
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Starting Network Manager...
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.7793] NetworkManager (version 1.54.3-2.el9) is starting... (boot:b69df90a-35c3-4c3f-8202-6e7c0e72a85a)
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.7798] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.7955] manager[0x56116ecc6000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8014] hostname: hostname: using hostnamed
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8015] hostname: static hostname changed from (none) to "np0005621130.novalocal"
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8021] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8133] manager[0x56116ecc6000]: rfkill: Wi-Fi hardware radio set enabled
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8134] manager[0x56116ecc6000]: rfkill: WWAN hardware radio set enabled
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8224] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8224] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8225] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8226] manager: Networking is enabled by state file
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8228] settings: Loaded settings plugin: keyfile (internal)
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8263] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8297] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8320] dhcp: init: Using DHCP client 'internal'
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8325] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8344] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8360] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8376] device (lo): Activation: starting connection 'lo' (06a79a63-f313-4cf6-b532-81d7b1898ab4)
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8389] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8394] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8432] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8438] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8443] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8446] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Started Network Manager.
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8450] device (eth0): carrier: link connected
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8456] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8465] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8476] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Reached target Network.
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8483] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8484] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8488] manager: NetworkManager state is now CONNECTING
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8491] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8504] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8508] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8766] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8771] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 16 16:13:31 np0005621130.novalocal NetworkManager[872]: <info>  [1771258411.8779] device (lo): Activation: successful, device activated.
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Reached target NFS client services.
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: Reached target Remote File Systems.
Feb 16 16:13:31 np0005621130.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 16 16:13:32 np0005621130.novalocal NetworkManager[872]: <info>  [1771258412.9029] dhcp4 (eth0): state changed new lease, address=38.102.83.146
Feb 16 16:13:32 np0005621130.novalocal NetworkManager[872]: <info>  [1771258412.9043] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 16 16:13:32 np0005621130.novalocal NetworkManager[872]: <info>  [1771258412.9069] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 16:13:32 np0005621130.novalocal NetworkManager[872]: <info>  [1771258412.9104] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 16:13:32 np0005621130.novalocal NetworkManager[872]: <info>  [1771258412.9106] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 16:13:32 np0005621130.novalocal NetworkManager[872]: <info>  [1771258412.9111] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 16:13:32 np0005621130.novalocal NetworkManager[872]: <info>  [1771258412.9115] device (eth0): Activation: successful, device activated.
Feb 16 16:13:32 np0005621130.novalocal NetworkManager[872]: <info>  [1771258412.9123] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 16 16:13:32 np0005621130.novalocal NetworkManager[872]: <info>  [1771258412.9130] manager: startup complete
Feb 16 16:13:32 np0005621130.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 16 16:13:32 np0005621130.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: Cloud-init v. 24.4-8.el9 running 'init' at Mon, 16 Feb 2026 16:13:33 +0000. Up 8.75 seconds.
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: |  eth0  | True |        38.102.83.146         | 255.255.255.0 | global | fa:16:3e:2c:ae:88 |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: |  eth0  | True | fe80::f816:3eff:fe2c:ae88/64 |       .       |  link  | fa:16:3e:2c:ae:88 |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Feb 16 16:13:33 np0005621130.novalocal cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 16 16:13:34 np0005621130.novalocal useradd[1003]: new group: name=cloud-user, GID=1001
Feb 16 16:13:34 np0005621130.novalocal useradd[1003]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 16 16:13:34 np0005621130.novalocal useradd[1003]: add 'cloud-user' to group 'adm'
Feb 16 16:13:34 np0005621130.novalocal useradd[1003]: add 'cloud-user' to group 'systemd-journal'
Feb 16 16:13:34 np0005621130.novalocal useradd[1003]: add 'cloud-user' to shadow group 'adm'
Feb 16 16:13:34 np0005621130.novalocal useradd[1003]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: Generating public/private rsa key pair.
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: The key fingerprint is:
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: SHA256:VKXVxR2KDEm04dMLsp4FSskhDmlJjQ2CWYEcghaBY1s root@np0005621130.novalocal
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: The key's randomart image is:
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: +---[RSA 3072]----+
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |O@&o .  o=o.o. ++|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |XOoEo o .o*o. o o|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |+.o. + o.=.+ .   |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: | .  . ..+ o .    |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |     . .S. .     |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |      . o        |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |       o         |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |                 |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |                 |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: +----[SHA256]-----+
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: Generating public/private ecdsa key pair.
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: The key fingerprint is:
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: SHA256:fzlXLW3DPptp28L4QnIZmG27BfIqXWPgPG+UHv80aq0 root@np0005621130.novalocal
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: The key's randomart image is:
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: +---[ECDSA 256]---+
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |                 |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |                 |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |           +     |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |          = = ...|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |        So = *.o=|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |         .= &.o+.|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |         ..%+Oo+.|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |        . o.B+=+B|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |         . ..E+B+|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: +----[SHA256]-----+
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: Generating public/private ed25519 key pair.
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: The key fingerprint is:
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: SHA256:o9cYJL31a5XTDUJdTbSzdwYNxNTA1+YL2zrA6vjBuks root@np0005621130.novalocal
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: The key's randomart image is:
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: +--[ED25519 256]--+
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |            .*=**|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |       .   .  +oB|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |      . o . . o*.|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |       o o . o ==|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |        S . . B.B|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |       ..= o + =o|
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |      .Eooo + .  |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |      ..o... o   |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: |       ==o    .  |
Feb 16 16:13:34 np0005621130.novalocal cloud-init[936]: +----[SHA256]-----+
Feb 16 16:13:34 np0005621130.novalocal sm-notify[1019]: Version 2.5.4 starting
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Feb 16 16:13:34 np0005621130.novalocal sshd[1021]: Server listening on 0.0.0.0 port 22.
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 16 16:13:34 np0005621130.novalocal sshd[1021]: Server listening on :: port 22.
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Reached target Network is Online.
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Starting System Logging Service...
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Starting Permit User Sessions...
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Finished Permit User Sessions.
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Started Command Scheduler.
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Started Getty on tty1.
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 16 16:13:34 np0005621130.novalocal crond[1024]: (CRON) STARTUP (1.5.7)
Feb 16 16:13:34 np0005621130.novalocal crond[1024]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Reached target Login Prompts.
Feb 16 16:13:34 np0005621130.novalocal crond[1024]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 66% if used.)
Feb 16 16:13:34 np0005621130.novalocal crond[1024]: (CRON) INFO (running with inotify support)
Feb 16 16:13:34 np0005621130.novalocal rsyslogd[1020]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1020" x-info="https://www.rsyslog.com"] start
Feb 16 16:13:34 np0005621130.novalocal rsyslogd[1020]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Started System Logging Service.
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Reached target Multi-User System.
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 16 16:13:34 np0005621130.novalocal sshd-session[1057]: Unable to negotiate with 38.102.83.114 port 52066: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 16 16:13:34 np0005621130.novalocal rsyslogd[1020]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 16:13:34 np0005621130.novalocal sshd-session[1114]: Unable to negotiate with 38.102.83.114 port 52088: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 16 16:13:34 np0005621130.novalocal sshd-session[1028]: Connection closed by 38.102.83.114 port 52058 [preauth]
Feb 16 16:13:34 np0005621130.novalocal sshd-session[1123]: Unable to negotiate with 38.102.83.114 port 52104: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 16 16:13:34 np0005621130.novalocal sshd-session[1152]: Connection closed by 38.102.83.114 port 52116 [preauth]
Feb 16 16:13:34 np0005621130.novalocal sshd-session[1160]: Unable to negotiate with 38.102.83.114 port 52130: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Feb 16 16:13:34 np0005621130.novalocal sshd-session[1070]: Connection closed by 38.102.83.114 port 52082 [preauth]
Feb 16 16:13:34 np0005621130.novalocal sshd-session[1164]: Unable to negotiate with 38.102.83.114 port 52138: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 16 16:13:34 np0005621130.novalocal kdumpctl[1033]: kdump: No kdump initial ramdisk found.
Feb 16 16:13:34 np0005621130.novalocal kdumpctl[1033]: kdump: Rebuilding /boot/initramfs-5.14.0-677.el9.x86_64kdump.img
Feb 16 16:13:34 np0005621130.novalocal cloud-init[1184]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Mon, 16 Feb 2026 16:13:34 +0000. Up 10.49 seconds.
Feb 16 16:13:34 np0005621130.novalocal sshd-session[1136]: Connection closed by 38.102.83.114 port 52106 [preauth]
Feb 16 16:13:34 np0005621130.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Feb 16 16:13:35 np0005621130.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Feb 16 16:13:35 np0005621130.novalocal cloud-init[1468]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Mon, 16 Feb 2026 16:13:35 +0000. Up 10.85 seconds.
Feb 16 16:13:35 np0005621130.novalocal cloud-init[1498]: #############################################################
Feb 16 16:13:35 np0005621130.novalocal cloud-init[1501]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 16 16:13:35 np0005621130.novalocal cloud-init[1511]: 256 SHA256:fzlXLW3DPptp28L4QnIZmG27BfIqXWPgPG+UHv80aq0 root@np0005621130.novalocal (ECDSA)
Feb 16 16:13:35 np0005621130.novalocal cloud-init[1515]: 256 SHA256:o9cYJL31a5XTDUJdTbSzdwYNxNTA1+YL2zrA6vjBuks root@np0005621130.novalocal (ED25519)
Feb 16 16:13:35 np0005621130.novalocal cloud-init[1521]: 3072 SHA256:VKXVxR2KDEm04dMLsp4FSskhDmlJjQ2CWYEcghaBY1s root@np0005621130.novalocal (RSA)
Feb 16 16:13:35 np0005621130.novalocal cloud-init[1525]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 16 16:13:35 np0005621130.novalocal cloud-init[1526]: #############################################################
Feb 16 16:13:35 np0005621130.novalocal dracut[1539]: dracut-057-110.git20260130.el9
Feb 16 16:13:35 np0005621130.novalocal cloud-init[1468]: Cloud-init v. 24.4-8.el9 finished at Mon, 16 Feb 2026 16:13:35 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.03 seconds
Feb 16 16:13:35 np0005621130.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Feb 16 16:13:35 np0005621130.novalocal systemd[1]: Reached target Cloud-init target.
Feb 16 16:13:35 np0005621130.novalocal dracut[1541]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/19ee07ed-c14b-4aa3-804d-f2cbdae2694f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-677.el9.x86_64kdump.img 5.14.0-677.el9.x86_64
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: Module 'resume' will not be installed, because it's in the list to be omitted!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: memstrack is not available
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: memstrack is not available
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 16 16:13:36 np0005621130.novalocal dracut[1541]: *** Including module: systemd ***
Feb 16 16:13:37 np0005621130.novalocal dracut[1541]: *** Including module: fips ***
Feb 16 16:13:37 np0005621130.novalocal dracut[1541]: *** Including module: systemd-initrd ***
Feb 16 16:13:37 np0005621130.novalocal dracut[1541]: *** Including module: i18n ***
Feb 16 16:13:37 np0005621130.novalocal dracut[1541]: *** Including module: drm ***
Feb 16 16:13:37 np0005621130.novalocal chronyd[829]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Feb 16 16:13:37 np0005621130.novalocal chronyd[829]: System clock TAI offset set to 37 seconds
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]: *** Including module: prefixdevname ***
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]: *** Including module: kernel-modules ***
Feb 16 16:13:38 np0005621130.novalocal kernel: block vda: the capability attribute has been deprecated.
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]: *** Including module: kernel-modules-extra ***
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]: *** Including module: qemu ***
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]: *** Including module: fstab-sys ***
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]: *** Including module: rootfs-block ***
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]: *** Including module: terminfo ***
Feb 16 16:13:38 np0005621130.novalocal dracut[1541]: *** Including module: udev-rules ***
Feb 16 16:13:39 np0005621130.novalocal chronyd[829]: Selected source 149.56.19.163 (2.centos.pool.ntp.org)
Feb 16 16:13:39 np0005621130.novalocal dracut[1541]: Skipping udev rule: 91-permissions.rules
Feb 16 16:13:39 np0005621130.novalocal dracut[1541]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 16 16:13:39 np0005621130.novalocal dracut[1541]: *** Including module: virtiofs ***
Feb 16 16:13:39 np0005621130.novalocal dracut[1541]: *** Including module: dracut-systemd ***
Feb 16 16:13:39 np0005621130.novalocal dracut[1541]: *** Including module: usrmount ***
Feb 16 16:13:39 np0005621130.novalocal dracut[1541]: *** Including module: base ***
Feb 16 16:13:39 np0005621130.novalocal dracut[1541]: *** Including module: fs-lib ***
Feb 16 16:13:39 np0005621130.novalocal dracut[1541]: *** Including module: kdumpbase ***
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:   microcode_ctl module: mangling fw_dir
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: configuration "intel" is ignored
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: Cannot change IRQ 25 affinity: Operation not permitted
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: IRQ 25 affinity is now unmanaged
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: Cannot change IRQ 31 affinity: Operation not permitted
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: IRQ 31 affinity is now unmanaged
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: Cannot change IRQ 28 affinity: Operation not permitted
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: IRQ 28 affinity is now unmanaged
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: Cannot change IRQ 32 affinity: Operation not permitted
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: IRQ 32 affinity is now unmanaged
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: Cannot change IRQ 30 affinity: Operation not permitted
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: IRQ 30 affinity is now unmanaged
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: Cannot change IRQ 29 affinity: Operation not permitted
Feb 16 16:13:40 np0005621130.novalocal irqbalance[819]: IRQ 29 affinity is now unmanaged
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]: *** Including module: openssl ***
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]: *** Including module: shutdown ***
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]: *** Including module: squash ***
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]: *** Including modules done ***
Feb 16 16:13:40 np0005621130.novalocal dracut[1541]: *** Installing kernel module dependencies ***
Feb 16 16:13:41 np0005621130.novalocal dracut[1541]: *** Installing kernel module dependencies done ***
Feb 16 16:13:41 np0005621130.novalocal dracut[1541]: *** Resolving executable dependencies ***
Feb 16 16:13:42 np0005621130.novalocal dracut[1541]: *** Resolving executable dependencies done ***
Feb 16 16:13:42 np0005621130.novalocal dracut[1541]: *** Generating early-microcode cpio image ***
Feb 16 16:13:42 np0005621130.novalocal dracut[1541]: *** Store current command line parameters ***
Feb 16 16:13:42 np0005621130.novalocal dracut[1541]: Stored kernel commandline:
Feb 16 16:13:42 np0005621130.novalocal dracut[1541]: No dracut internal kernel commandline stored in the initramfs
Feb 16 16:13:42 np0005621130.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 16:13:43 np0005621130.novalocal dracut[1541]: *** Install squash loader ***
Feb 16 16:13:43 np0005621130.novalocal dracut[1541]: *** Squashing the files inside the initramfs ***
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: *** Squashing the files inside the initramfs done ***
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: *** Creating image file '/boot/initramfs-5.14.0-677.el9.x86_64kdump.img' ***
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: *** Hardlinking files ***
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: Mode:           real
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: Files:          50
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: Linked:         0 files
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: Compared:       0 xattrs
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: Compared:       0 files
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: Saved:          0 B
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: Duration:       0.000536 seconds
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: *** Hardlinking files done ***
Feb 16 16:13:44 np0005621130.novalocal dracut[1541]: *** Creating initramfs image file '/boot/initramfs-5.14.0-677.el9.x86_64kdump.img' done ***
Feb 16 16:13:45 np0005621130.novalocal kdumpctl[1033]: kdump: kexec: loaded kdump kernel
Feb 16 16:13:45 np0005621130.novalocal kdumpctl[1033]: kdump: Starting kdump: [OK]
Feb 16 16:13:45 np0005621130.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 16 16:13:45 np0005621130.novalocal systemd[1]: Startup finished in 1.527s (kernel) + 2.870s (initrd) + 16.642s (userspace) = 21.041s.
Feb 16 16:14:01 np0005621130.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 16:14:04 np0005621130.novalocal sshd-session[4798]: Accepted publickey for zuul from 38.102.83.114 port 60220 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 16 16:14:04 np0005621130.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 16 16:14:04 np0005621130.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 16 16:14:04 np0005621130.novalocal systemd-logind[821]: New session 1 of user zuul.
Feb 16 16:14:04 np0005621130.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 16 16:14:04 np0005621130.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Queued start job for default target Main User Target.
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Created slice User Application Slice.
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Reached target Paths.
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Reached target Timers.
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Starting D-Bus User Message Bus Socket...
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Starting Create User's Volatile Files and Directories...
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Finished Create User's Volatile Files and Directories.
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Listening on D-Bus User Message Bus Socket.
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Reached target Sockets.
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Reached target Basic System.
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Reached target Main User Target.
Feb 16 16:14:04 np0005621130.novalocal systemd[4802]: Startup finished in 112ms.
Feb 16 16:14:04 np0005621130.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 16 16:14:04 np0005621130.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 16 16:14:04 np0005621130.novalocal sshd-session[4798]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 16:14:05 np0005621130.novalocal python3[4884]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 16:14:08 np0005621130.novalocal python3[4912]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 16:14:15 np0005621130.novalocal python3[4970]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 16:14:16 np0005621130.novalocal python3[5010]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 16 16:14:18 np0005621130.novalocal python3[5036]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDUJfFYxQCoBRlKt1DVEFwnAe0HVWXdOOfz32jz8KNUoaqNvrOePKNomFxUYpTXauhD22SvZnnEDsY5bJm3IkmSSbjMWuEpzWmPwGo7nLGAPKsXuUTgGBEr8QLGhkHe9O1jwtFyShNcQnYmdC4womAP6KIMJhkhxS2WVZXRxpFVbkWffGxb3eLVdtysMVBDGh0GsCVNJXuFhiRMswerczQbRd9FdxYv20jKPo8zZqs6e9HIiWPC22wWLLqhY9u1cV00Wt6U8AfxPzu0dTJK5l3/t8/C2Qs/yEpfvLrdp9kTuWP/ZM2z0wTB1+NXZ8+Fts2a5HFla3Q1GLgOgpVrsPSuref6zMvf6Aj1TvyALEqJ5JIw2BOGHifzFcfvwfYE/rgX5Hv5Y/ujQQcXjEY5ZYe3hEKX/7G6VoRFbWvgyB3THqpMtt590f+xgDcmRNXEw42I39CSpxrFbdGTVG7Y6NS7Fg0lKI9yIAfC/Rq204OA0Ciz9VWm7ndXWP2LBVnnSlM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:18 np0005621130.novalocal python3[5060]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:19 np0005621130.novalocal python3[5159]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:14:19 np0005621130.novalocal python3[5230]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771258459.1391356-229-254431350642644/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=d06035df7c78466dba3005be40ffa870_id_rsa follow=False checksum=9cf4db580c72decb915e1d5dfc482328f65cfe24 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:20 np0005621130.novalocal python3[5353]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:14:20 np0005621130.novalocal python3[5424]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771258460.0839474-273-3450584819225/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=d06035df7c78466dba3005be40ffa870_id_rsa.pub follow=False checksum=eae25d6d5590297e8495f1144c723746748b6649 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:22 np0005621130.novalocal python3[5472]: ansible-ping Invoked with data=pong
Feb 16 16:14:23 np0005621130.novalocal python3[5496]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 16:14:25 np0005621130.novalocal python3[5554]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 16 16:14:26 np0005621130.novalocal python3[5586]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:26 np0005621130.novalocal python3[5610]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:26 np0005621130.novalocal python3[5634]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:27 np0005621130.novalocal python3[5658]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:27 np0005621130.novalocal python3[5682]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:27 np0005621130.novalocal python3[5706]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:29 np0005621130.novalocal sudo[5730]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emxonkcgdirnixppcyhkfmqsybqsmvrk ; /usr/bin/python3'
Feb 16 16:14:29 np0005621130.novalocal sudo[5730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:14:29 np0005621130.novalocal python3[5732]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:29 np0005621130.novalocal sudo[5730]: pam_unix(sudo:session): session closed for user root
Feb 16 16:14:29 np0005621130.novalocal sudo[5808]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uupawtqpgbgpsldenpwgncfzcrvzoqni ; /usr/bin/python3'
Feb 16 16:14:29 np0005621130.novalocal sudo[5808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:14:30 np0005621130.novalocal python3[5810]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:14:30 np0005621130.novalocal sudo[5808]: pam_unix(sudo:session): session closed for user root
Feb 16 16:14:30 np0005621130.novalocal irqbalance[819]: Cannot change IRQ 26 affinity: Operation not permitted
Feb 16 16:14:30 np0005621130.novalocal irqbalance[819]: IRQ 26 affinity is now unmanaged
Feb 16 16:14:30 np0005621130.novalocal sudo[5881]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntyspnhetyxkgzyrsxsdrwsihigxrypt ; /usr/bin/python3'
Feb 16 16:14:30 np0005621130.novalocal sudo[5881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:14:30 np0005621130.novalocal python3[5883]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771258469.679722-26-45238525659270/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:30 np0005621130.novalocal sudo[5881]: pam_unix(sudo:session): session closed for user root
Feb 16 16:14:31 np0005621130.novalocal python3[5931]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:31 np0005621130.novalocal python3[5955]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:31 np0005621130.novalocal python3[5979]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:31 np0005621130.novalocal python3[6003]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:32 np0005621130.novalocal python3[6027]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:32 np0005621130.novalocal python3[6051]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:32 np0005621130.novalocal python3[6075]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:32 np0005621130.novalocal python3[6099]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:33 np0005621130.novalocal python3[6123]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:33 np0005621130.novalocal python3[6147]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:33 np0005621130.novalocal python3[6171]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:33 np0005621130.novalocal python3[6195]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:34 np0005621130.novalocal python3[6219]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:34 np0005621130.novalocal python3[6243]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:34 np0005621130.novalocal python3[6267]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:34 np0005621130.novalocal python3[6291]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:35 np0005621130.novalocal python3[6315]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:35 np0005621130.novalocal python3[6339]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:35 np0005621130.novalocal python3[6363]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:36 np0005621130.novalocal python3[6387]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:36 np0005621130.novalocal python3[6411]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:36 np0005621130.novalocal python3[6435]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:36 np0005621130.novalocal python3[6459]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:37 np0005621130.novalocal python3[6483]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:37 np0005621130.novalocal python3[6507]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:37 np0005621130.novalocal python3[6531]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:14:40 np0005621130.novalocal sudo[6555]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfwdnezkrchryomasarlibqghrafptlf ; /usr/bin/python3'
Feb 16 16:14:40 np0005621130.novalocal sudo[6555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:14:41 np0005621130.novalocal python3[6557]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 16 16:14:41 np0005621130.novalocal systemd[1]: Starting Time & Date Service...
Feb 16 16:14:41 np0005621130.novalocal systemd[1]: Started Time & Date Service.
Feb 16 16:14:41 np0005621130.novalocal systemd-timedated[6559]: Changed time zone to 'UTC' (UTC).
Feb 16 16:14:41 np0005621130.novalocal sudo[6555]: pam_unix(sudo:session): session closed for user root
Feb 16 16:14:41 np0005621130.novalocal sudo[6586]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bibeumelrewksgqygpbayvoqbcswddmv ; /usr/bin/python3'
Feb 16 16:14:41 np0005621130.novalocal sudo[6586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:14:41 np0005621130.novalocal python3[6588]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:41 np0005621130.novalocal sudo[6586]: pam_unix(sudo:session): session closed for user root
Feb 16 16:14:42 np0005621130.novalocal python3[6664]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:14:42 np0005621130.novalocal python3[6735]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1771258481.9153333-202-73840898104868/source _original_basename=tmpi69id70s follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:43 np0005621130.novalocal python3[6835]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:14:43 np0005621130.novalocal python3[6906]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771258482.81727-242-184978865604692/source _original_basename=tmp8o3_vj73 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:44 np0005621130.novalocal sudo[7006]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugobbheihowwztaakwwbrakckvkhzdol ; /usr/bin/python3'
Feb 16 16:14:44 np0005621130.novalocal sudo[7006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:14:44 np0005621130.novalocal python3[7008]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:14:44 np0005621130.novalocal sudo[7006]: pam_unix(sudo:session): session closed for user root
Feb 16 16:14:44 np0005621130.novalocal sudo[7079]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qphkwxlwwdygcbjokiouhfwljpeafaux ; /usr/bin/python3'
Feb 16 16:14:44 np0005621130.novalocal sudo[7079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:14:44 np0005621130.novalocal python3[7081]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771258484.1329324-306-181051416295890/source _original_basename=tmpl2_3jbk7 follow=False checksum=4a0ca57baae0ad7db719d1cf1b93f3d0aefbfe4f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:44 np0005621130.novalocal sudo[7079]: pam_unix(sudo:session): session closed for user root
Feb 16 16:14:45 np0005621130.novalocal python3[7129]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:14:45 np0005621130.novalocal python3[7155]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:14:46 np0005621130.novalocal sudo[7233]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkyxxdjcyulqfmuowmdfmekdxnyvysgu ; /usr/bin/python3'
Feb 16 16:14:46 np0005621130.novalocal sudo[7233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:14:46 np0005621130.novalocal python3[7235]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:14:46 np0005621130.novalocal sudo[7233]: pam_unix(sudo:session): session closed for user root
Feb 16 16:14:46 np0005621130.novalocal sudo[7306]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzhaipzkfxlqvjmerirsqsdzxriqnukr ; /usr/bin/python3'
Feb 16 16:14:46 np0005621130.novalocal sudo[7306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:14:46 np0005621130.novalocal python3[7308]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1771258485.8754015-362-160428473148852/source _original_basename=tmpfgbchlc0 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:14:46 np0005621130.novalocal sudo[7306]: pam_unix(sudo:session): session closed for user root
Feb 16 16:14:46 np0005621130.novalocal sudo[7357]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnnazqdzuvtigqmyrlwtafkvgeiqabod ; /usr/bin/python3'
Feb 16 16:14:46 np0005621130.novalocal sudo[7357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:14:47 np0005621130.novalocal python3[7359]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-5423-3d4b-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:14:47 np0005621130.novalocal sudo[7357]: pam_unix(sudo:session): session closed for user root
Feb 16 16:14:47 np0005621130.novalocal python3[7387]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163ef9-e89a-5423-3d4b-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 16 16:14:49 np0005621130.novalocal python3[7415]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:15:06 np0005621130.novalocal sudo[7439]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywiudtqxastdadzndggtytafevnnicwk ; /usr/bin/python3'
Feb 16 16:15:06 np0005621130.novalocal sudo[7439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:15:06 np0005621130.novalocal python3[7441]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:15:06 np0005621130.novalocal sudo[7439]: pam_unix(sudo:session): session closed for user root
Feb 16 16:15:11 np0005621130.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 16 16:15:41 np0005621130.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 16 16:15:41 np0005621130.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Feb 16 16:15:41 np0005621130.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Feb 16 16:15:41 np0005621130.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Feb 16 16:15:41 np0005621130.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Feb 16 16:15:41 np0005621130.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Feb 16 16:15:41 np0005621130.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Feb 16 16:15:41 np0005621130.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Feb 16 16:15:41 np0005621130.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Feb 16 16:15:41 np0005621130.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 16 16:15:41 np0005621130.novalocal NetworkManager[872]: <info>  [1771258541.5291] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 16 16:15:41 np0005621130.novalocal systemd-udevd[7444]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 16:15:41 np0005621130.novalocal NetworkManager[872]: <info>  [1771258541.5455] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 16:15:41 np0005621130.novalocal NetworkManager[872]: <info>  [1771258541.5479] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 16 16:15:41 np0005621130.novalocal NetworkManager[872]: <info>  [1771258541.5482] device (eth1): carrier: link connected
Feb 16 16:15:41 np0005621130.novalocal NetworkManager[872]: <info>  [1771258541.5484] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 16 16:15:41 np0005621130.novalocal NetworkManager[872]: <info>  [1771258541.5489] policy: auto-activating connection 'Wired connection 1' (5343486a-909f-3711-a4c4-9479ef6f81b7)
Feb 16 16:15:41 np0005621130.novalocal NetworkManager[872]: <info>  [1771258541.5493] device (eth1): Activation: starting connection 'Wired connection 1' (5343486a-909f-3711-a4c4-9479ef6f81b7)
Feb 16 16:15:41 np0005621130.novalocal NetworkManager[872]: <info>  [1771258541.5494] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 16:15:41 np0005621130.novalocal NetworkManager[872]: <info>  [1771258541.5497] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 16:15:41 np0005621130.novalocal NetworkManager[872]: <info>  [1771258541.5501] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 16:15:41 np0005621130.novalocal NetworkManager[872]: <info>  [1771258541.5505] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 16 16:15:42 np0005621130.novalocal python3[7471]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-8abf-3e9c-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:15:48 np0005621130.novalocal chronyd[829]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Feb 16 16:15:49 np0005621130.novalocal sudo[7549]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yopmoeahftocjkdrgmzsjreloyrcovsn ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 16:15:49 np0005621130.novalocal sudo[7549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:15:49 np0005621130.novalocal python3[7551]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:15:49 np0005621130.novalocal sudo[7549]: pam_unix(sudo:session): session closed for user root
Feb 16 16:15:49 np0005621130.novalocal sudo[7622]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tknqlqrehaxomfnzxdljnmhvvnlhylnp ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 16:15:49 np0005621130.novalocal sudo[7622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:15:49 np0005621130.novalocal python3[7624]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771258549.047211-103-147279761273884/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=3515d01c891564f95f78d58e25f059f055ae9956 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:15:49 np0005621130.novalocal sudo[7622]: pam_unix(sudo:session): session closed for user root
Feb 16 16:15:50 np0005621130.novalocal sudo[7672]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogcahrgcstxiqezfsszexjfiiozftxzn ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 16:15:50 np0005621130.novalocal sudo[7672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:15:50 np0005621130.novalocal python3[7674]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[872]: <info>  [1771258550.4442] caught SIGTERM, shutting down normally.
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: Stopping Network Manager...
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[872]: <info>  [1771258550.4451] dhcp4 (eth0): canceled DHCP transaction
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[872]: <info>  [1771258550.4452] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[872]: <info>  [1771258550.4452] dhcp4 (eth0): state changed no lease
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[872]: <info>  [1771258550.4454] manager: NetworkManager state is now CONNECTING
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[872]: <info>  [1771258550.4631] dhcp4 (eth1): canceled DHCP transaction
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[872]: <info>  [1771258550.4631] dhcp4 (eth1): state changed no lease
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[872]: <info>  [1771258550.4694] exiting (success)
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: Stopped Network Manager.
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: Starting Network Manager...
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.5403] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:b69df90a-35c3-4c3f-8202-6e7c0e72a85a)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.5406] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.5450] manager[0x5585bf715000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: Starting Hostname Service...
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: Started Hostname Service.
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6296] hostname: hostname: using hostnamed
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6296] hostname: static hostname changed from (none) to "np0005621130.novalocal"
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6302] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6308] manager[0x5585bf715000]: rfkill: Wi-Fi hardware radio set enabled
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6308] manager[0x5585bf715000]: rfkill: WWAN hardware radio set enabled
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6346] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6347] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6347] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6348] manager: Networking is enabled by state file
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6351] settings: Loaded settings plugin: keyfile (internal)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6356] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6391] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6403] dhcp: init: Using DHCP client 'internal'
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6406] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6413] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6420] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6431] device (lo): Activation: starting connection 'lo' (06a79a63-f313-4cf6-b532-81d7b1898ab4)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6438] device (eth0): carrier: link connected
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6444] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6450] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6451] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6459] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6468] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6475] device (eth1): carrier: link connected
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6480] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6487] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (5343486a-909f-3711-a4c4-9479ef6f81b7) (indicated)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6487] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6494] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6503] device (eth1): Activation: starting connection 'Wired connection 1' (5343486a-909f-3711-a4c4-9479ef6f81b7)
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: Started Network Manager.
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6511] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6516] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6532] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6535] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6539] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6544] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6548] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6551] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6557] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6571] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6578] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6592] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6599] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6628] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6635] dhcp4 (eth0): state changed new lease, address=38.102.83.146
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6645] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6658] device (lo): Activation: successful, device activated.
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6680] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 16 16:15:50 np0005621130.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6776] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal sudo[7672]: pam_unix(sudo:session): session closed for user root
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6825] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6829] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6836] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6842] device (eth0): Activation: successful, device activated.
Feb 16 16:15:50 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258550.6856] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 16 16:15:50 np0005621130.novalocal python3[7758]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-8abf-3e9c-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:16:00 np0005621130.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 16:16:20 np0005621130.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4182] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 16:16:35 np0005621130.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 16:16:35 np0005621130.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4477] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4479] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4487] device (eth1): Activation: successful, device activated.
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4495] manager: startup complete
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4497] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <warn>  [1771258595.4504] device (eth1): Activation: failed for connection 'Wired connection 1'
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4514] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Feb 16 16:16:35 np0005621130.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4741] dhcp4 (eth1): canceled DHCP transaction
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4741] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4741] dhcp4 (eth1): state changed no lease
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4763] policy: auto-activating connection 'ci-private-network' (4afdf514-3b13-5696-9896-ab4bb68602bc)
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4771] device (eth1): Activation: starting connection 'ci-private-network' (4afdf514-3b13-5696-9896-ab4bb68602bc)
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4773] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4779] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4791] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4804] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4873] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4877] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 16:16:35 np0005621130.novalocal NetworkManager[7686]: <info>  [1771258595.4887] device (eth1): Activation: successful, device activated.
Feb 16 16:16:36 np0005621130.novalocal systemd[4802]: Starting Mark boot as successful...
Feb 16 16:16:36 np0005621130.novalocal systemd[4802]: Finished Mark boot as successful.
Feb 16 16:16:45 np0005621130.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 16:16:51 np0005621130.novalocal sshd-session[4811]: Received disconnect from 38.102.83.114 port 60220:11: disconnected by user
Feb 16 16:16:51 np0005621130.novalocal sshd-session[4811]: Disconnected from user zuul 38.102.83.114 port 60220
Feb 16 16:16:51 np0005621130.novalocal sshd-session[4798]: pam_unix(sshd:session): session closed for user zuul
Feb 16 16:16:51 np0005621130.novalocal systemd-logind[821]: Session 1 logged out. Waiting for processes to exit.
Feb 16 16:17:21 np0005621130.novalocal sshd-session[7787]: Accepted publickey for zuul from 38.102.83.114 port 34904 ssh2: RSA SHA256:ihfnzQ/yqoljho9l5byE5LF6hkoYfBrxpcsfdSjUwnI
Feb 16 16:17:21 np0005621130.novalocal systemd-logind[821]: New session 3 of user zuul.
Feb 16 16:17:21 np0005621130.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 16 16:17:21 np0005621130.novalocal sshd-session[7787]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 16:17:21 np0005621130.novalocal sudo[7866]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khdlifdwejjugbegxnsnwwwlicilkpox ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 16:17:21 np0005621130.novalocal sudo[7866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:17:21 np0005621130.novalocal python3[7868]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:17:21 np0005621130.novalocal sudo[7866]: pam_unix(sudo:session): session closed for user root
Feb 16 16:17:22 np0005621130.novalocal sudo[7939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tehbofapncjcipcnxnvhmfhkeniqhouz ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 16 16:17:22 np0005621130.novalocal sudo[7939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:17:22 np0005621130.novalocal python3[7941]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771258641.6129322-312-184127433420686/source _original_basename=tmp0c02w3gk follow=False checksum=8a8217e3983897b9f6bbede232bc1998b72a3cb4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:17:22 np0005621130.novalocal sudo[7939]: pam_unix(sudo:session): session closed for user root
Feb 16 16:17:25 np0005621130.novalocal sshd-session[7790]: Connection closed by 38.102.83.114 port 34904
Feb 16 16:17:25 np0005621130.novalocal sshd-session[7787]: pam_unix(sshd:session): session closed for user zuul
Feb 16 16:17:25 np0005621130.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 16 16:17:25 np0005621130.novalocal systemd-logind[821]: Session 3 logged out. Waiting for processes to exit.
Feb 16 16:17:25 np0005621130.novalocal systemd-logind[821]: Removed session 3.
Feb 16 16:19:36 np0005621130.novalocal systemd[4802]: Created slice User Background Tasks Slice.
Feb 16 16:19:36 np0005621130.novalocal systemd[4802]: Starting Cleanup of User's Temporary Files and Directories...
Feb 16 16:19:36 np0005621130.novalocal systemd[4802]: Finished Cleanup of User's Temporary Files and Directories.
Feb 16 16:28:25 np0005621130.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Feb 16 16:28:25 np0005621130.novalocal sshd-session[7973]: Accepted publickey for zuul from 38.102.83.114 port 45708 ssh2: RSA SHA256:ihfnzQ/yqoljho9l5byE5LF6hkoYfBrxpcsfdSjUwnI
Feb 16 16:28:25 np0005621130.novalocal systemd-logind[821]: New session 4 of user zuul.
Feb 16 16:28:25 np0005621130.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 16 16:28:25 np0005621130.novalocal sshd-session[7973]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 16:28:25 np0005621130.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 16 16:28:25 np0005621130.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Feb 16 16:28:25 np0005621130.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 16 16:28:25 np0005621130.novalocal sudo[8003]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzsydjgklqsoyzgwqojttvaxhgjxfgmq ; /usr/bin/python3'
Feb 16 16:28:25 np0005621130.novalocal sudo[8003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:25 np0005621130.novalocal python3[8005]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-b6e4-7e9e-000000000cd5-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:28:25 np0005621130.novalocal sudo[8003]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:25 np0005621130.novalocal sudo[8032]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtjnqgsjzxpptzdrpnjiigdlrdmwnetu ; /usr/bin/python3'
Feb 16 16:28:25 np0005621130.novalocal sudo[8032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:25 np0005621130.novalocal python3[8034]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:28:25 np0005621130.novalocal sudo[8032]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:25 np0005621130.novalocal sudo[8058]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrxepmuwstactumgsxeychlzmiymogch ; /usr/bin/python3'
Feb 16 16:28:25 np0005621130.novalocal sudo[8058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:25 np0005621130.novalocal python3[8060]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:28:25 np0005621130.novalocal sudo[8058]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:26 np0005621130.novalocal sudo[8084]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpyskrpsqhguelvugglqwulswdcrrppm ; /usr/bin/python3'
Feb 16 16:28:26 np0005621130.novalocal sudo[8084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:26 np0005621130.novalocal python3[8086]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:28:26 np0005621130.novalocal sudo[8084]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:26 np0005621130.novalocal sudo[8110]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlieuehdvfvxckohiimbysxbeiozsybn ; /usr/bin/python3'
Feb 16 16:28:26 np0005621130.novalocal sudo[8110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:26 np0005621130.novalocal python3[8112]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:28:26 np0005621130.novalocal sudo[8110]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:26 np0005621130.novalocal sudo[8136]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uumwozdnzdajgvkhtyyyfgmzqxbugmoa ; /usr/bin/python3'
Feb 16 16:28:26 np0005621130.novalocal sudo[8136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:26 np0005621130.novalocal python3[8138]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:28:27 np0005621130.novalocal sudo[8136]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:27 np0005621130.novalocal sudo[8214]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajmfbxrjebtsncmuycfhyfkfguwpfjgl ; /usr/bin/python3'
Feb 16 16:28:27 np0005621130.novalocal sudo[8214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:27 np0005621130.novalocal python3[8216]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:28:27 np0005621130.novalocal sudo[8214]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:27 np0005621130.novalocal sudo[8287]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erfxqvwgrbtnoxnybzjfgbkgghnlrvak ; /usr/bin/python3'
Feb 16 16:28:27 np0005621130.novalocal sudo[8287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:27 np0005621130.novalocal python3[8289]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771259307.2104108-365-268552464817565/source _original_basename=tmpm335wzwb follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:28:27 np0005621130.novalocal sudo[8287]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:28 np0005621130.novalocal sudo[8337]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlasalpwxsnjudahhbjsyqtdsfcnhyaf ; /usr/bin/python3'
Feb 16 16:28:28 np0005621130.novalocal sudo[8337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:28 np0005621130.novalocal python3[8339]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 16:28:28 np0005621130.novalocal systemd[1]: Reloading.
Feb 16 16:28:28 np0005621130.novalocal systemd-rc-local-generator[8356]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 16:28:29 np0005621130.novalocal sudo[8337]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:30 np0005621130.novalocal sudo[8399]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpiwokpmyrncwecltynfyrmjvdbnskcm ; /usr/bin/python3'
Feb 16 16:28:30 np0005621130.novalocal sudo[8399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:30 np0005621130.novalocal python3[8401]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 16 16:28:30 np0005621130.novalocal sudo[8399]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:30 np0005621130.novalocal sudo[8425]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unismorutqrcjemxymmptculyenteegt ; /usr/bin/python3'
Feb 16 16:28:30 np0005621130.novalocal sudo[8425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:31 np0005621130.novalocal python3[8427]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:28:31 np0005621130.novalocal sudo[8425]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:31 np0005621130.novalocal sudo[8453]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfzrwpdzrlmjrpblyffqdcujfinbdjyq ; /usr/bin/python3'
Feb 16 16:28:31 np0005621130.novalocal sudo[8453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:31 np0005621130.novalocal python3[8455]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:28:31 np0005621130.novalocal sudo[8453]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:31 np0005621130.novalocal sudo[8481]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykcgoviltnfqcpdvtejvccpamtxqorgs ; /usr/bin/python3'
Feb 16 16:28:31 np0005621130.novalocal sudo[8481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:31 np0005621130.novalocal python3[8483]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:28:31 np0005621130.novalocal sudo[8481]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:31 np0005621130.novalocal sudo[8509]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqnjjrwxqknuiadlcrrudlmmsiajvmez ; /usr/bin/python3'
Feb 16 16:28:31 np0005621130.novalocal sudo[8509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:31 np0005621130.novalocal python3[8511]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:28:31 np0005621130.novalocal sudo[8509]: pam_unix(sudo:session): session closed for user root
Feb 16 16:28:32 np0005621130.novalocal python3[8538]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max; _uses_shell=True zuul_log_id=fa163ef9-e89a-b6e4-7e9e-000000000cdc-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:28:33 np0005621130.novalocal python3[8568]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 16 16:28:35 np0005621130.novalocal sshd-session[7978]: Connection closed by 38.102.83.114 port 45708
Feb 16 16:28:35 np0005621130.novalocal sshd-session[7973]: pam_unix(sshd:session): session closed for user zuul
Feb 16 16:28:35 np0005621130.novalocal systemd-logind[821]: Session 4 logged out. Waiting for processes to exit.
Feb 16 16:28:35 np0005621130.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 16 16:28:35 np0005621130.novalocal systemd[1]: session-4.scope: Consumed 4.015s CPU time.
Feb 16 16:28:35 np0005621130.novalocal systemd-logind[821]: Removed session 4.
Feb 16 16:28:37 np0005621130.novalocal sshd-session[8576]: Accepted publickey for zuul from 38.102.83.114 port 51634 ssh2: RSA SHA256:ihfnzQ/yqoljho9l5byE5LF6hkoYfBrxpcsfdSjUwnI
Feb 16 16:28:37 np0005621130.novalocal systemd-logind[821]: New session 5 of user zuul.
Feb 16 16:28:37 np0005621130.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 16 16:28:37 np0005621130.novalocal sshd-session[8576]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 16:28:37 np0005621130.novalocal sudo[8603]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucchkmffblrlhcngteryrsbhulosxouw ; /usr/bin/python3'
Feb 16 16:28:37 np0005621130.novalocal sudo[8603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:28:37 np0005621130.novalocal python3[8605]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 16 16:28:38 np0005621130.novalocal sshd-session[8570]: Connection reset by 198.235.24.143 port 61336 [preauth]
Feb 16 16:28:43 np0005621130.novalocal setsebool[8641]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 16 16:28:43 np0005621130.novalocal setsebool[8641]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 16 16:28:53 np0005621130.novalocal kernel: SELinux:  Converting 385 SID table entries...
Feb 16 16:28:53 np0005621130.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 16:28:53 np0005621130.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 16 16:28:53 np0005621130.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 16:28:53 np0005621130.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 16 16:28:53 np0005621130.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 16:28:53 np0005621130.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 16:28:53 np0005621130.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 16:29:02 np0005621130.novalocal kernel: SELinux:  Converting 388 SID table entries...
Feb 16 16:29:02 np0005621130.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 16:29:02 np0005621130.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 16 16:29:02 np0005621130.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 16:29:02 np0005621130.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 16 16:29:02 np0005621130.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 16:29:02 np0005621130.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 16:29:02 np0005621130.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 16:29:20 np0005621130.novalocal dbus-broker-launch[807]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 16 16:29:20 np0005621130.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 16:29:20 np0005621130.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 16 16:29:20 np0005621130.novalocal systemd[1]: Reloading.
Feb 16 16:29:21 np0005621130.novalocal systemd-rc-local-generator[9426]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 16:29:21 np0005621130.novalocal systemd[1]: Starting dnf makecache...
Feb 16 16:29:21 np0005621130.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 16:29:21 np0005621130.novalocal dnf[9566]: Failed determining last makecache time.
Feb 16 16:29:21 np0005621130.novalocal dnf[9566]: CentOS Stream 9 - BaseOS                         61 kB/s | 7.0 kB     00:00
Feb 16 16:29:21 np0005621130.novalocal dnf[9566]: CentOS Stream 9 - BaseOS                         17 kB/s | 3.9 kB     00:00
Feb 16 16:29:21 np0005621130.novalocal dnf[9566]: Errors during downloading metadata for repository 'baseos':
Feb 16 16:29:21 np0005621130.novalocal dnf[9566]:   - Downloading successful, but checksum doesn't match. Calculated: 2ec71dcf1ef98a2ef3b5be4be2ec2d00d6f242e0baee6a1e667766a97b57ef74cfeb48ee39404b0fa0fd874793cb86e479cdb65acf5c026384ca7918cdbe3a8f(sha512)  Expected: 2c50d5655176d6e9deef7042f8ab73b80c537bcc8cdf725735a4e28b9798aaa39831b4013e6ff968584e76e25edd15d87f1e9290454e6a80bd6d6d95b6130846(sha512)
Feb 16 16:29:21 np0005621130.novalocal dnf[9566]: Error: Failed to download metadata for repo 'baseos': Cannot download repomd.xml: Downloading successful, but checksum doesn't match. Calculated: 2ec71dcf1ef98a2ef3b5be4be2ec2d00d6f242e0baee6a1e667766a97b57ef74cfeb48ee39404b0fa0fd874793cb86e479cdb65acf5c026384ca7918cdbe3a8f(sha512)  Expected: 2c50d5655176d6e9deef7042f8ab73b80c537bcc8cdf725735a4e28b9798aaa39831b4013e6ff968584e76e25edd15d87f1e9290454e6a80bd6d6d95b6130846(sha512)
Feb 16 16:29:21 np0005621130.novalocal systemd[1]: dnf-makecache.service: Main process exited, code=exited, status=1/FAILURE
Feb 16 16:29:21 np0005621130.novalocal systemd[1]: dnf-makecache.service: Failed with result 'exit-code'.
Feb 16 16:29:21 np0005621130.novalocal systemd[1]: Failed to start dnf makecache.
Feb 16 16:29:22 np0005621130.novalocal sudo[8603]: pam_unix(sudo:session): session closed for user root
Feb 16 16:29:28 np0005621130.novalocal python3[15342]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot" _uses_shell=True zuul_log_id=fa163ef9-e89a-1c47-7d95-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:29:29 np0005621130.novalocal kernel: evm: overlay not supported
Feb 16 16:29:29 np0005621130.novalocal systemd[4802]: Starting D-Bus User Message Bus...
Feb 16 16:29:29 np0005621130.novalocal dbus-broker-launch[15906]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 16 16:29:29 np0005621130.novalocal dbus-broker-launch[15906]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 16 16:29:29 np0005621130.novalocal systemd[4802]: Started D-Bus User Message Bus.
Feb 16 16:29:29 np0005621130.novalocal dbus-broker-lau[15906]: Ready
Feb 16 16:29:29 np0005621130.novalocal systemd[4802]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 16 16:29:29 np0005621130.novalocal systemd[4802]: Created slice Slice /user.
Feb 16 16:29:29 np0005621130.novalocal systemd[4802]: podman-15830.scope: unit configures an IP firewall, but not running as root.
Feb 16 16:29:29 np0005621130.novalocal systemd[4802]: (This warning is only shown for the first unit using IP firewalling.)
Feb 16 16:29:29 np0005621130.novalocal systemd[4802]: Started podman-15830.scope.
Feb 16 16:29:29 np0005621130.novalocal systemd[4802]: Started podman-pause-7692b243.scope.
Feb 16 16:29:30 np0005621130.novalocal sudo[16796]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzuuzleebjrxhqdqzhseakdzramzscco ; /usr/bin/python3'
Feb 16 16:29:30 np0005621130.novalocal sudo[16796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:29:30 np0005621130.novalocal python3[16808]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.224:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.224:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:29:30 np0005621130.novalocal python3[16808]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Feb 16 16:29:30 np0005621130.novalocal sudo[16796]: pam_unix(sudo:session): session closed for user root
Feb 16 16:29:31 np0005621130.novalocal sshd-session[8579]: Connection closed by 38.102.83.114 port 51634
Feb 16 16:29:31 np0005621130.novalocal sshd-session[8576]: pam_unix(sshd:session): session closed for user zuul
Feb 16 16:29:31 np0005621130.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Feb 16 16:29:31 np0005621130.novalocal systemd[1]: session-5.scope: Consumed 40.169s CPU time.
Feb 16 16:29:31 np0005621130.novalocal systemd-logind[821]: Session 5 logged out. Waiting for processes to exit.
Feb 16 16:29:31 np0005621130.novalocal systemd-logind[821]: Removed session 5.
Feb 16 16:29:35 np0005621130.novalocal sshd-session[19054]: Connection closed by authenticating user root 45.148.10.121 port 48660 [preauth]
Feb 16 16:29:48 np0005621130.novalocal sshd-session[24924]: Unable to negotiate with 38.102.83.9 port 37024: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 16 16:29:48 np0005621130.novalocal sshd-session[24921]: Unable to negotiate with 38.102.83.9 port 37054: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 16 16:29:48 np0005621130.novalocal sshd-session[24927]: Unable to negotiate with 38.102.83.9 port 37040: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 16 16:29:48 np0005621130.novalocal sshd-session[24926]: Connection closed by 38.102.83.9 port 37018 [preauth]
Feb 16 16:29:48 np0005621130.novalocal sshd-session[24929]: Connection closed by 38.102.83.9 port 37022 [preauth]
Feb 16 16:29:52 np0005621130.novalocal sshd-session[26802]: Accepted publickey for zuul from 38.102.83.114 port 51636 ssh2: RSA SHA256:ihfnzQ/yqoljho9l5byE5LF6hkoYfBrxpcsfdSjUwnI
Feb 16 16:29:52 np0005621130.novalocal systemd-logind[821]: New session 6 of user zuul.
Feb 16 16:29:52 np0005621130.novalocal systemd[1]: Started Session 6 of User zuul.
Feb 16 16:29:52 np0005621130.novalocal sshd-session[26802]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 16:29:52 np0005621130.novalocal python3[26898]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ1hbp41kJSyMflP+RStHXQ+s1dUIRetIPm61cW3r4nJb8d3e4lM97lEb2FScuO4gOqNpEtBtI+BtrGH0kJqEbI= zuul@np0005621129.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:29:52 np0005621130.novalocal sudo[27073]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxzncsiptgllszvteycsdfkctuwnpyxr ; /usr/bin/python3'
Feb 16 16:29:52 np0005621130.novalocal sudo[27073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:29:52 np0005621130.novalocal python3[27083]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ1hbp41kJSyMflP+RStHXQ+s1dUIRetIPm61cW3r4nJb8d3e4lM97lEb2FScuO4gOqNpEtBtI+BtrGH0kJqEbI= zuul@np0005621129.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:29:53 np0005621130.novalocal sudo[27073]: pam_unix(sudo:session): session closed for user root
Feb 16 16:29:53 np0005621130.novalocal sudo[27463]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axejiemytrntgzogtxjrchzdubksiaoy ; /usr/bin/python3'
Feb 16 16:29:53 np0005621130.novalocal sudo[27463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:29:53 np0005621130.novalocal python3[27472]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005621130.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 16 16:29:53 np0005621130.novalocal useradd[27562]: new group: name=cloud-admin, GID=1002
Feb 16 16:29:53 np0005621130.novalocal useradd[27562]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Feb 16 16:29:53 np0005621130.novalocal sudo[27463]: pam_unix(sudo:session): session closed for user root
Feb 16 16:29:54 np0005621130.novalocal sudo[27721]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijqcuhqflysnfhuubvpqdarnoqrepomm ; /usr/bin/python3'
Feb 16 16:29:54 np0005621130.novalocal sudo[27721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:29:54 np0005621130.novalocal python3[27732]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJ1hbp41kJSyMflP+RStHXQ+s1dUIRetIPm61cW3r4nJb8d3e4lM97lEb2FScuO4gOqNpEtBtI+BtrGH0kJqEbI= zuul@np0005621129.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 16 16:29:54 np0005621130.novalocal sudo[27721]: pam_unix(sudo:session): session closed for user root
Feb 16 16:29:54 np0005621130.novalocal sudo[27989]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlgzrmgcnrviyxduqgnufavklwqwsqna ; /usr/bin/python3'
Feb 16 16:29:54 np0005621130.novalocal sudo[27989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:29:54 np0005621130.novalocal python3[28000]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:29:54 np0005621130.novalocal sudo[27989]: pam_unix(sudo:session): session closed for user root
Feb 16 16:29:55 np0005621130.novalocal sudo[28271]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhddjeagaqlowwcoqjnpvcaauywdjble ; /usr/bin/python3'
Feb 16 16:29:55 np0005621130.novalocal sudo[28271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:29:55 np0005621130.novalocal python3[28278]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771259394.3809412-151-221296003919202/source _original_basename=tmptkdsbc72 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:29:55 np0005621130.novalocal sudo[28271]: pam_unix(sudo:session): session closed for user root
Feb 16 16:29:55 np0005621130.novalocal sudo[28689]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iazmxnqqujqlftgtluxfwvewmblzegib ; /usr/bin/python3'
Feb 16 16:29:55 np0005621130.novalocal sudo[28689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:29:56 np0005621130.novalocal python3[28704]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Feb 16 16:29:56 np0005621130.novalocal systemd[1]: Starting Hostname Service...
Feb 16 16:29:56 np0005621130.novalocal systemd[1]: Started Hostname Service.
Feb 16 16:29:56 np0005621130.novalocal systemd-hostnamed[28854]: Changed pretty hostname to 'compute-0'
Feb 16 16:29:56 compute-0 systemd-hostnamed[28854]: Hostname set to <compute-0> (static)
Feb 16 16:29:56 compute-0 NetworkManager[7686]: <info>  [1771259396.1663] hostname: static hostname changed from "np0005621130.novalocal" to "compute-0"
Feb 16 16:29:56 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 16:29:56 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 16:29:56 compute-0 sudo[28689]: pam_unix(sudo:session): session closed for user root
Feb 16 16:29:56 compute-0 sshd-session[26837]: Connection closed by 38.102.83.114 port 51636
Feb 16 16:29:56 compute-0 sshd-session[26802]: pam_unix(sshd:session): session closed for user zuul
Feb 16 16:29:56 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Feb 16 16:29:56 compute-0 systemd[1]: session-6.scope: Consumed 2.258s CPU time.
Feb 16 16:29:56 compute-0 systemd-logind[821]: Session 6 logged out. Waiting for processes to exit.
Feb 16 16:29:56 compute-0 systemd-logind[821]: Removed session 6.
Feb 16 16:30:02 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 16:30:02 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 16:30:02 compute-0 systemd[1]: man-db-cache-update.service: Consumed 42.608s CPU time.
Feb 16 16:30:02 compute-0 systemd[1]: run-r32ef691116494e77ae0d06b7b427a931.service: Deactivated successfully.
Feb 16 16:30:06 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 16:30:26 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 16:33:33 compute-0 sshd-session[30559]: Accepted publickey for zuul from 38.102.83.9 port 36334 ssh2: RSA SHA256:ihfnzQ/yqoljho9l5byE5LF6hkoYfBrxpcsfdSjUwnI
Feb 16 16:33:33 compute-0 systemd-logind[821]: New session 7 of user zuul.
Feb 16 16:33:33 compute-0 systemd[1]: Started Session 7 of User zuul.
Feb 16 16:33:33 compute-0 sshd-session[30559]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 16:33:33 compute-0 python3[30635]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 16:33:34 compute-0 sudo[30749]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zttvndxtktpnpjxiffekultqbmrtojpb ; /usr/bin/python3'
Feb 16 16:33:34 compute-0 sudo[30749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:35 compute-0 python3[30751]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:33:35 compute-0 sudo[30749]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:35 compute-0 sudo[30822]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fffxlwugdhjmrfwifumxhewjapvgvxlx ; /usr/bin/python3'
Feb 16 16:33:35 compute-0 sudo[30822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:35 compute-0 python3[30824]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771259614.842074-34344-262456378719996/source mode=0755 _original_basename=delorean.repo follow=False checksum=cc4ab4695da8ec58c451521a3dd2f41014af145d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:33:35 compute-0 sudo[30822]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:35 compute-0 sudo[30848]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpsszchuohvhepsjlvcsetxnyxhqbvrp ; /usr/bin/python3'
Feb 16 16:33:35 compute-0 sudo[30848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:35 compute-0 python3[30850]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:33:35 compute-0 sudo[30848]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:36 compute-0 sudo[30921]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-libxlzsqobskdtkhplsyjrpkcjtqnswo ; /usr/bin/python3'
Feb 16 16:33:36 compute-0 sudo[30921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:36 compute-0 python3[30923]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771259614.842074-34344-262456378719996/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:33:36 compute-0 sudo[30921]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:36 compute-0 sudo[30947]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfzyidpvoudqmkvajbtkadhnkzxxqhzj ; /usr/bin/python3'
Feb 16 16:33:36 compute-0 sudo[30947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:36 compute-0 python3[30949]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:33:36 compute-0 sudo[30947]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:36 compute-0 sudo[31020]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spenlofidcfxbosarbnkpaairkvccmnf ; /usr/bin/python3'
Feb 16 16:33:36 compute-0 sudo[31020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:37 compute-0 python3[31022]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771259614.842074-34344-262456378719996/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:33:37 compute-0 sudo[31020]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:37 compute-0 sudo[31046]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpkwveovyhfguehgilhymhulgcspnxvw ; /usr/bin/python3'
Feb 16 16:33:37 compute-0 sudo[31046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:37 compute-0 python3[31048]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:33:37 compute-0 sudo[31046]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:37 compute-0 sudo[31119]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckqndeornqdnlcwzhqrkkzpkfbnnwkzj ; /usr/bin/python3'
Feb 16 16:33:37 compute-0 sudo[31119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:37 compute-0 python3[31121]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771259614.842074-34344-262456378719996/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:33:37 compute-0 sudo[31119]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:37 compute-0 sudo[31145]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqigobpkmupnsbskqaweleytprvzfxdc ; /usr/bin/python3'
Feb 16 16:33:37 compute-0 sudo[31145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:37 compute-0 python3[31147]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:33:37 compute-0 sudo[31145]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:37 compute-0 sudo[31218]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzejdwliuiqkadxpthoauoohfccebkoy ; /usr/bin/python3'
Feb 16 16:33:37 compute-0 sudo[31218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:38 compute-0 python3[31220]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771259614.842074-34344-262456378719996/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:33:38 compute-0 sudo[31218]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:38 compute-0 sudo[31244]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfzdtunrefazrfhpdozfnzaloefpwude ; /usr/bin/python3'
Feb 16 16:33:38 compute-0 sudo[31244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:38 compute-0 python3[31246]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:33:38 compute-0 sudo[31244]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:38 compute-0 sudo[31317]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmagvnfrfzobcgcizfazkeeenugodfzi ; /usr/bin/python3'
Feb 16 16:33:38 compute-0 sudo[31317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:38 compute-0 python3[31319]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771259614.842074-34344-262456378719996/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:33:38 compute-0 sudo[31317]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:38 compute-0 sudo[31343]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebrfczbgqdjlblqxmqwpxbhffhwycgos ; /usr/bin/python3'
Feb 16 16:33:38 compute-0 sudo[31343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:38 compute-0 python3[31345]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 16 16:33:38 compute-0 sudo[31343]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:39 compute-0 sudo[31416]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixgscyqlpzqqqejdyhaxbyxnelcevsyj ; /usr/bin/python3'
Feb 16 16:33:39 compute-0 sudo[31416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:33:39 compute-0 python3[31418]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1771259614.842074-34344-262456378719996/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=362a603578148d54e8cd25942b88d7f471cc677a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:33:39 compute-0 sudo[31416]: pam_unix(sudo:session): session closed for user root
Feb 16 16:33:41 compute-0 sshd-session[31443]: Unable to negotiate with 192.168.122.11 port 46092: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 16 16:33:41 compute-0 sshd-session[31444]: Connection closed by 192.168.122.11 port 46080 [preauth]
Feb 16 16:33:41 compute-0 sshd-session[31445]: Connection closed by 192.168.122.11 port 46086 [preauth]
Feb 16 16:33:41 compute-0 sshd-session[31446]: Unable to negotiate with 192.168.122.11 port 46094: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 16 16:33:41 compute-0 sshd-session[31447]: Unable to negotiate with 192.168.122.11 port 46100: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 16 16:37:11 compute-0 python3[31479]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:39:38 compute-0 sshd-session[31481]: Connection closed by 27.190.15.128 port 39960
Feb 16 16:42:10 compute-0 sshd-session[30562]: Received disconnect from 38.102.83.9 port 36334:11: disconnected by user
Feb 16 16:42:10 compute-0 sshd-session[30562]: Disconnected from user zuul 38.102.83.9 port 36334
Feb 16 16:42:10 compute-0 sshd-session[30559]: pam_unix(sshd:session): session closed for user zuul
Feb 16 16:42:10 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Feb 16 16:42:10 compute-0 systemd[1]: session-7.scope: Consumed 4.805s CPU time.
Feb 16 16:42:10 compute-0 systemd-logind[821]: Session 7 logged out. Waiting for processes to exit.
Feb 16 16:42:10 compute-0 systemd-logind[821]: Removed session 7.
Feb 16 16:57:22 compute-0 sshd-session[31489]: Accepted publickey for zuul from 192.168.122.30 port 36360 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 16:57:23 compute-0 systemd-logind[821]: New session 8 of user zuul.
Feb 16 16:57:23 compute-0 systemd[1]: Started Session 8 of User zuul.
Feb 16 16:57:23 compute-0 sshd-session[31489]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 16:57:24 compute-0 python3.9[31642]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 16:57:25 compute-0 sudo[31821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwjzdcjtogzvnhllsarrkjhonruaagcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261044.5990748-44-258347602860188/AnsiballZ_command.py'
Feb 16 16:57:25 compute-0 sudo[31821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:57:25 compute-0 python3.9[31823]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:57:31 compute-0 sudo[31821]: pam_unix(sudo:session): session closed for user root
Feb 16 16:57:32 compute-0 sshd-session[31492]: Connection closed by 192.168.122.30 port 36360
Feb 16 16:57:32 compute-0 sshd-session[31489]: pam_unix(sshd:session): session closed for user zuul
Feb 16 16:57:32 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Feb 16 16:57:32 compute-0 systemd[1]: session-8.scope: Consumed 7.617s CPU time.
Feb 16 16:57:32 compute-0 systemd-logind[821]: Session 8 logged out. Waiting for processes to exit.
Feb 16 16:57:32 compute-0 systemd-logind[821]: Removed session 8.
Feb 16 16:57:37 compute-0 sshd-session[31881]: Accepted publickey for zuul from 192.168.122.30 port 49616 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 16:57:37 compute-0 systemd-logind[821]: New session 9 of user zuul.
Feb 16 16:57:38 compute-0 systemd[1]: Started Session 9 of User zuul.
Feb 16 16:57:38 compute-0 sshd-session[31881]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 16:57:38 compute-0 python3.9[32034]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 16:57:39 compute-0 sshd-session[31884]: Connection closed by 192.168.122.30 port 49616
Feb 16 16:57:39 compute-0 sshd-session[31881]: pam_unix(sshd:session): session closed for user zuul
Feb 16 16:57:39 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Feb 16 16:57:39 compute-0 systemd-logind[821]: Session 9 logged out. Waiting for processes to exit.
Feb 16 16:57:39 compute-0 systemd-logind[821]: Removed session 9.
Feb 16 16:57:54 compute-0 sshd-session[32062]: Accepted publickey for zuul from 192.168.122.30 port 50280 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 16:57:54 compute-0 systemd-logind[821]: New session 10 of user zuul.
Feb 16 16:57:55 compute-0 systemd[1]: Started Session 10 of User zuul.
Feb 16 16:57:55 compute-0 sshd-session[32062]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 16:57:55 compute-0 python3.9[32215]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 16 16:57:56 compute-0 python3.9[32389]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 16:57:57 compute-0 sudo[32539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eauybxmybpoyzeafncuofjjnshydjgpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261077.1830282-68-68075975506576/AnsiballZ_command.py'
Feb 16 16:57:57 compute-0 sudo[32539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:57:57 compute-0 python3.9[32541]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 16:57:57 compute-0 sudo[32539]: pam_unix(sudo:session): session closed for user root
Feb 16 16:57:58 compute-0 sudo[32692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vohaemfluuokkfxjwfkfdghfrsrmjeic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261078.2998753-92-127443814049792/AnsiballZ_stat.py'
Feb 16 16:57:58 compute-0 sudo[32692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:57:58 compute-0 python3.9[32694]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 16:57:58 compute-0 sudo[32692]: pam_unix(sudo:session): session closed for user root
Feb 16 16:57:59 compute-0 sudo[32844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igptsniwsogwnijwbodwisgipmuynmmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261079.0100758-108-236292386405640/AnsiballZ_file.py'
Feb 16 16:57:59 compute-0 sudo[32844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:57:59 compute-0 python3.9[32846]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:57:59 compute-0 sudo[32844]: pam_unix(sudo:session): session closed for user root
Feb 16 16:58:00 compute-0 sudo[32996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdasjkpezocgakblepmndejvtfmyfety ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261079.7842205-124-210329986452145/AnsiballZ_stat.py'
Feb 16 16:58:00 compute-0 sudo[32996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:58:00 compute-0 python3.9[32998]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 16:58:00 compute-0 sudo[32996]: pam_unix(sudo:session): session closed for user root
Feb 16 16:58:00 compute-0 sudo[33119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldvgwqnkqcfaydykcucaidijrqwueubv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261079.7842205-124-210329986452145/AnsiballZ_copy.py'
Feb 16 16:58:00 compute-0 sudo[33119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:58:00 compute-0 python3.9[33121]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261079.7842205-124-210329986452145/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:58:00 compute-0 sudo[33119]: pam_unix(sudo:session): session closed for user root
Feb 16 16:58:01 compute-0 sudo[33271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkwelgayxjivgdstbqvahvjpvhbzhxze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261081.0700645-154-39265482158137/AnsiballZ_setup.py'
Feb 16 16:58:01 compute-0 sudo[33271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:58:01 compute-0 python3.9[33273]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 16:58:01 compute-0 sudo[33271]: pam_unix(sudo:session): session closed for user root
Feb 16 16:58:02 compute-0 sudo[33427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixdwvnlvihfvnustyzlvnlkfftcpwtmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261082.0056934-170-117898651496497/AnsiballZ_file.py'
Feb 16 16:58:02 compute-0 sudo[33427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:58:02 compute-0 python3.9[33429]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 16:58:02 compute-0 sudo[33427]: pam_unix(sudo:session): session closed for user root
Feb 16 16:58:02 compute-0 sudo[33579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynxahcvqrxnvevdipjedosplkuhfzfrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261082.6888077-188-216501401005197/AnsiballZ_file.py'
Feb 16 16:58:02 compute-0 sudo[33579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:58:03 compute-0 python3.9[33581]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 16:58:03 compute-0 sudo[33579]: pam_unix(sudo:session): session closed for user root
Feb 16 16:58:03 compute-0 python3.9[33731]: ansible-ansible.builtin.service_facts Invoked
Feb 16 16:58:06 compute-0 python3.9[33985]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 16:58:07 compute-0 python3.9[34135]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 16:58:09 compute-0 python3.9[34289]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 16:58:09 compute-0 sudo[34445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qovgprhxjoxuqpouyualfhsmubvwsrai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261089.485223-284-130139189767288/AnsiballZ_setup.py'
Feb 16 16:58:09 compute-0 sudo[34445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:58:10 compute-0 python3.9[34447]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 16:58:10 compute-0 sudo[34445]: pam_unix(sudo:session): session closed for user root
Feb 16 16:58:10 compute-0 irqbalance[819]: Cannot change IRQ 27 affinity: Operation not permitted
Feb 16 16:58:10 compute-0 irqbalance[819]: IRQ 27 affinity is now unmanaged
Feb 16 16:58:10 compute-0 sudo[34529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tezlknyhsvbjxeyeupjinxiotwiefcrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261089.485223-284-130139189767288/AnsiballZ_dnf.py'
Feb 16 16:58:10 compute-0 sudo[34529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 16:58:10 compute-0 python3.9[34531]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 16:59:36 compute-0 systemd[1]: Reloading.
Feb 16 16:59:36 compute-0 systemd-rc-local-generator[34913]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 16:59:37 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 16 16:59:37 compute-0 systemd[1]: Reloading.
Feb 16 16:59:37 compute-0 systemd-rc-local-generator[34962]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 16:59:37 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 16 16:59:37 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 16 16:59:37 compute-0 systemd[1]: Reloading.
Feb 16 16:59:37 compute-0 systemd-rc-local-generator[35006]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 16:59:37 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Feb 16 16:59:38 compute-0 dbus-broker-launch[797]: Noticed file-system modification, trigger reload.
Feb 16 16:59:38 compute-0 dbus-broker-launch[797]: Noticed file-system modification, trigger reload.
Feb 16 16:59:38 compute-0 dbus-broker-launch[797]: Noticed file-system modification, trigger reload.
Feb 16 17:00:33 compute-0 kernel: SELinux:  Converting 2727 SID table entries...
Feb 16 17:00:33 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 17:00:33 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 17:00:33 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 17:00:33 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 17:00:33 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 17:00:33 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 17:00:33 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 17:00:33 compute-0 dbus-broker-launch[807]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 16 17:00:33 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 17:00:33 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 17:00:33 compute-0 systemd[1]: Reloading.
Feb 16 17:00:33 compute-0 systemd-rc-local-generator[35333]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:00:34 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 17:00:34 compute-0 sudo[34529]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:34 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 17:00:34 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 17:00:34 compute-0 systemd[1]: run-rb1ebabd19612493b83279f52f60f567a.service: Deactivated successfully.
Feb 16 17:00:34 compute-0 sudo[36255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbbtgudbuvcurnmjyiojwkzrcybydrji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261234.5997958-308-162981299819372/AnsiballZ_command.py'
Feb 16 17:00:34 compute-0 sudo[36255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:35 compute-0 python3.9[36257]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:00:36 compute-0 sudo[36255]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:37 compute-0 sudo[36536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hidvvkiwnnsydlmzzeiwccsbbtdlxblx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261236.7152832-324-251535825908241/AnsiballZ_selinux.py'
Feb 16 17:00:37 compute-0 sudo[36536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:37 compute-0 python3.9[36538]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 16 17:00:37 compute-0 sudo[36536]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:38 compute-0 sudo[36688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzgctuirdviqdewroujbhuijxccseghw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261238.1454923-346-118817781082705/AnsiballZ_command.py'
Feb 16 17:00:38 compute-0 sudo[36688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:38 compute-0 python3.9[36690]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 16 17:00:39 compute-0 sudo[36688]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:39 compute-0 sudo[36841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djgbclhpyvtlhdicffralpiuthyohnzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261239.5224538-362-45958336150401/AnsiballZ_file.py'
Feb 16 17:00:39 compute-0 sudo[36841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:40 compute-0 python3.9[36843]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:00:40 compute-0 sudo[36841]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:41 compute-0 sudo[36993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuyuwxmomrnkhrxtfjuhkcjeqasrykdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261241.188652-378-211410767207349/AnsiballZ_mount.py'
Feb 16 17:00:41 compute-0 sudo[36993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:41 compute-0 python3.9[36995]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 16 17:00:41 compute-0 sudo[36993]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:41 compute-0 rsyslogd[1020]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 17:00:41 compute-0 rsyslogd[1020]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 17:00:43 compute-0 sudo[37146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifxoxemnqagoaenmbxhlceytckofzngg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261243.2843816-434-185375846377604/AnsiballZ_file.py'
Feb 16 17:00:43 compute-0 sudo[37146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:43 compute-0 python3.9[37148]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:00:43 compute-0 sudo[37146]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:44 compute-0 sudo[37298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buestocugqzphbsmvjxtxcbnclhqyytb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261243.9031456-450-116214015720172/AnsiballZ_stat.py'
Feb 16 17:00:44 compute-0 sudo[37298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:44 compute-0 python3.9[37300]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:00:44 compute-0 sudo[37298]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:44 compute-0 sudo[37421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgrynvxwyanmqnpzcsuaxmusspjhpfxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261243.9031456-450-116214015720172/AnsiballZ_copy.py'
Feb 16 17:00:44 compute-0 sudo[37421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:48 compute-0 python3.9[37423]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261243.9031456-450-116214015720172/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2666b00e89898ecfd58a5d594369d5356783239e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:00:48 compute-0 sudo[37421]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:49 compute-0 sudo[37574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvvrefgeqqlkfmeaxkcnnxwwamlofqao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261249.0628185-498-240411689176810/AnsiballZ_stat.py'
Feb 16 17:00:49 compute-0 sudo[37574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:49 compute-0 python3.9[37576]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:00:49 compute-0 sudo[37574]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:49 compute-0 sudo[37726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbfqtnmonyqvqdnzbaeykqxklhihtzwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261249.6112146-514-105054166396436/AnsiballZ_command.py'
Feb 16 17:00:49 compute-0 sudo[37726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:50 compute-0 python3.9[37728]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:00:50 compute-0 sudo[37726]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:50 compute-0 sudo[37879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfasmiolfywaxczagfdyfnbcfkqbyazv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261250.307628-530-252669382444400/AnsiballZ_file.py'
Feb 16 17:00:50 compute-0 sudo[37879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:50 compute-0 python3.9[37881]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:00:50 compute-0 sudo[37879]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:51 compute-0 sudo[38031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slghjwoiyektjxunlviftslujbvmwykw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261251.1125357-552-23949900141176/AnsiballZ_getent.py'
Feb 16 17:00:51 compute-0 sudo[38031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:51 compute-0 python3.9[38033]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 16 17:00:51 compute-0 sudo[38031]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:52 compute-0 sudo[38184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fomnhpyerbjqompcncnmlpitxqpwygfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261251.875637-568-219076000145092/AnsiballZ_group.py'
Feb 16 17:00:52 compute-0 sudo[38184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:52 compute-0 python3.9[38186]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 17:00:52 compute-0 groupadd[38187]: group added to /etc/group: name=qemu, GID=107
Feb 16 17:00:52 compute-0 groupadd[38187]: group added to /etc/gshadow: name=qemu
Feb 16 17:00:52 compute-0 groupadd[38187]: new group: name=qemu, GID=107
Feb 16 17:00:52 compute-0 sudo[38184]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:53 compute-0 sudo[38342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dauecsblwnjqdnjnwjdyphipspiijjkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261252.710341-584-84277327851551/AnsiballZ_user.py'
Feb 16 17:00:53 compute-0 sudo[38342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:53 compute-0 python3.9[38344]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 17:00:53 compute-0 useradd[38346]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 17:00:53 compute-0 sudo[38342]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:53 compute-0 sudo[38502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dggaejukdtxtfxwhtqddoeplvvkcnmwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261253.6831117-600-279156361113084/AnsiballZ_getent.py'
Feb 16 17:00:53 compute-0 sudo[38502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:54 compute-0 python3.9[38504]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 16 17:00:54 compute-0 sudo[38502]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:54 compute-0 sudo[38655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itatinbisochreqqzteqjyfpzprqymye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261254.3263772-616-196056884482618/AnsiballZ_group.py'
Feb 16 17:00:54 compute-0 sudo[38655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:54 compute-0 python3.9[38657]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 17:00:54 compute-0 groupadd[38658]: group added to /etc/group: name=hugetlbfs, GID=42477
Feb 16 17:00:54 compute-0 groupadd[38658]: group added to /etc/gshadow: name=hugetlbfs
Feb 16 17:00:54 compute-0 groupadd[38658]: new group: name=hugetlbfs, GID=42477
Feb 16 17:00:54 compute-0 sudo[38655]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:55 compute-0 sudo[38813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyxjyekhyfhpcvyfbxoxaanbpzkccrtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261255.0214207-634-17940295571170/AnsiballZ_file.py'
Feb 16 17:00:55 compute-0 sudo[38813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:55 compute-0 python3.9[38815]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 16 17:00:55 compute-0 sudo[38813]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:56 compute-0 sudo[38965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgehevdijhfuxvolsavedlxpyjkmhiek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261255.888642-656-46272263374485/AnsiballZ_dnf.py'
Feb 16 17:00:56 compute-0 sudo[38965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:56 compute-0 python3.9[38967]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:00:57 compute-0 sudo[38965]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:58 compute-0 sudo[39119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjdnrhzbwnrldlgpivjlxjwocolddmsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261258.1820297-672-218278323608430/AnsiballZ_file.py'
Feb 16 17:00:58 compute-0 sudo[39119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:58 compute-0 python3.9[39121]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:00:58 compute-0 sudo[39119]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:59 compute-0 sudo[39271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykubvuyfqwhrqamnyoqjkrbvcryauneq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261258.81506-688-215244335861024/AnsiballZ_stat.py'
Feb 16 17:00:59 compute-0 sudo[39271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:59 compute-0 python3.9[39273]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:00:59 compute-0 sudo[39271]: pam_unix(sudo:session): session closed for user root
Feb 16 17:00:59 compute-0 sudo[39394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvvoaieogfdmpkixnpksxpehsroeqnny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261258.81506-688-215244335861024/AnsiballZ_copy.py'
Feb 16 17:00:59 compute-0 sudo[39394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:00:59 compute-0 python3.9[39396]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261258.81506-688-215244335861024/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:00:59 compute-0 sudo[39394]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:00 compute-0 sudo[39546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrcgdybcmtwdwykxtnplqvewxmpweqiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261259.9483418-718-226802768614104/AnsiballZ_systemd.py'
Feb 16 17:01:00 compute-0 sudo[39546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:00 compute-0 python3.9[39548]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:01:00 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 16 17:01:00 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 16 17:01:00 compute-0 kernel: Bridge firewalling registered
Feb 16 17:01:00 compute-0 systemd-modules-load[39552]: Inserted module 'br_netfilter'
Feb 16 17:01:00 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 16 17:01:01 compute-0 sudo[39546]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:01 compute-0 sudo[39706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtrvgsqnstlkwmpjkbwjjlgxtgwdbiml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261261.1664772-734-2172252296245/AnsiballZ_stat.py'
Feb 16 17:01:01 compute-0 sudo[39706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:01 compute-0 python3.9[39708]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:01:01 compute-0 sudo[39706]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:01 compute-0 CROND[39733]: (root) CMD (run-parts /etc/cron.hourly)
Feb 16 17:01:01 compute-0 run-parts[39736]: (/etc/cron.hourly) starting 0anacron
Feb 16 17:01:01 compute-0 anacron[39749]: Anacron started on 2026-02-16
Feb 16 17:01:01 compute-0 anacron[39749]: Will run job `cron.daily' in 13 min.
Feb 16 17:01:01 compute-0 anacron[39749]: Will run job `cron.weekly' in 33 min.
Feb 16 17:01:01 compute-0 anacron[39749]: Will run job `cron.monthly' in 53 min.
Feb 16 17:01:01 compute-0 anacron[39749]: Jobs will be executed sequentially
Feb 16 17:01:01 compute-0 run-parts[39752]: (/etc/cron.hourly) finished 0anacron
Feb 16 17:01:01 compute-0 CROND[39730]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 16 17:01:01 compute-0 sudo[39844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqhxnysvwptortuazywwapwayajppjcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261261.1664772-734-2172252296245/AnsiballZ_copy.py'
Feb 16 17:01:01 compute-0 sudo[39844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:02 compute-0 python3.9[39846]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261261.1664772-734-2172252296245/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:01:02 compute-0 sudo[39844]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:02 compute-0 sudo[39996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glpqvxabreupsygrtgqvetdpokeahfpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261262.6205423-770-238089011988142/AnsiballZ_dnf.py'
Feb 16 17:01:02 compute-0 sudo[39996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:03 compute-0 python3.9[39998]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:01:12 compute-0 dbus-broker-launch[797]: Noticed file-system modification, trigger reload.
Feb 16 17:01:13 compute-0 dbus-broker-launch[797]: Noticed file-system modification, trigger reload.
Feb 16 17:01:13 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 17:01:13 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 17:01:13 compute-0 systemd[1]: Reloading.
Feb 16 17:01:13 compute-0 systemd-rc-local-generator[40095]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:01:13 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 17:01:13 compute-0 sudo[39996]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:14 compute-0 python3.9[41780]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:01:15 compute-0 python3.9[43120]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 16 17:01:15 compute-0 python3.9[43993]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:01:15 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 17:01:15 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 17:01:15 compute-0 systemd[1]: man-db-cache-update.service: Consumed 3.243s CPU time.
Feb 16 17:01:15 compute-0 systemd[1]: run-rfb46631d43d24ac794a8ff085e56188e.service: Deactivated successfully.
Feb 16 17:01:16 compute-0 sudo[44278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eambijnymzjqjejqumdhkssumcbcpltf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261276.2848601-848-277701651239934/AnsiballZ_command.py'
Feb 16 17:01:16 compute-0 sudo[44278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:16 compute-0 python3.9[44280]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:01:16 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 16 17:01:17 compute-0 systemd[1]: Starting Authorization Manager...
Feb 16 17:01:17 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 16 17:01:17 compute-0 polkitd[44497]: Started polkitd version 0.117
Feb 16 17:01:17 compute-0 polkitd[44497]: Loading rules from directory /etc/polkit-1/rules.d
Feb 16 17:01:17 compute-0 polkitd[44497]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 16 17:01:17 compute-0 polkitd[44497]: Finished loading, compiling and executing 2 rules
Feb 16 17:01:17 compute-0 systemd[1]: Started Authorization Manager.
Feb 16 17:01:17 compute-0 polkitd[44497]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 16 17:01:17 compute-0 sudo[44278]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:17 compute-0 sudo[44665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qndhsxrgkbfolzcqpqyjrajafvjccuhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261277.6312528-866-1693959286531/AnsiballZ_systemd.py'
Feb 16 17:01:17 compute-0 sudo[44665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:18 compute-0 python3.9[44667]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:01:18 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 16 17:01:18 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Feb 16 17:01:18 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 16 17:01:18 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 16 17:01:18 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 16 17:01:18 compute-0 sudo[44665]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:19 compute-0 python3.9[44829]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 16 17:01:21 compute-0 sudo[44979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thualfyskyoeekrtidbdkrxqxpdylazf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261281.697481-980-91254665915140/AnsiballZ_systemd.py'
Feb 16 17:01:21 compute-0 sudo[44979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:22 compute-0 python3.9[44981]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:01:22 compute-0 systemd[1]: Reloading.
Feb 16 17:01:22 compute-0 systemd-rc-local-generator[45006]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:01:22 compute-0 sudo[44979]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:22 compute-0 sudo[45176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eebskfvdanlvpxkofaeahlnciixosejh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261282.6738126-980-98342447904802/AnsiballZ_systemd.py'
Feb 16 17:01:22 compute-0 sudo[45176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:23 compute-0 python3.9[45178]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:01:23 compute-0 systemd[1]: Reloading.
Feb 16 17:01:23 compute-0 systemd-rc-local-generator[45201]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:01:23 compute-0 sudo[45176]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:24 compute-0 sudo[45371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioevtiqcqkuwruicoodbpspdwbxqlmpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261283.7966912-1012-47740768825682/AnsiballZ_command.py'
Feb 16 17:01:24 compute-0 sudo[45371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:24 compute-0 python3.9[45373]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:01:24 compute-0 sudo[45371]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:25 compute-0 sudo[45524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcqzmavsyeccbphkzqvaqfzousxjwyxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261284.819162-1028-29426570369170/AnsiballZ_command.py'
Feb 16 17:01:25 compute-0 sudo[45524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:25 compute-0 python3.9[45526]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:01:25 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Feb 16 17:01:25 compute-0 sudo[45524]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:25 compute-0 sudo[45677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tklfhbzqyjsxqyhwvmlheasxujcxizgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261285.4481342-1044-185352369990399/AnsiballZ_command.py'
Feb 16 17:01:25 compute-0 sudo[45677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:25 compute-0 python3.9[45679]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:01:27 compute-0 sudo[45677]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:27 compute-0 sudo[45839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcavpwtacunjmwiwnborphufzlqqcttb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261287.5684085-1060-57991803988217/AnsiballZ_command.py'
Feb 16 17:01:27 compute-0 sudo[45839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:27 compute-0 python3.9[45841]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:01:27 compute-0 sudo[45839]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:28 compute-0 sudo[45992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wygscuyxebdencbxisvxvfjjnvishcun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261288.160632-1076-104470823953/AnsiballZ_systemd.py'
Feb 16 17:01:28 compute-0 sudo[45992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:28 compute-0 python3.9[45994]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:01:28 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 16 17:01:28 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Feb 16 17:01:28 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Feb 16 17:01:28 compute-0 systemd[1]: Starting Apply Kernel Variables...
Feb 16 17:01:28 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 16 17:01:28 compute-0 systemd[1]: Finished Apply Kernel Variables.
Feb 16 17:01:28 compute-0 sudo[45992]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:29 compute-0 sshd-session[32065]: Connection closed by 192.168.122.30 port 50280
Feb 16 17:01:29 compute-0 sshd-session[32062]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:01:29 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Feb 16 17:01:29 compute-0 systemd[1]: session-10.scope: Consumed 2min 6.414s CPU time.
Feb 16 17:01:29 compute-0 systemd-logind[821]: Session 10 logged out. Waiting for processes to exit.
Feb 16 17:01:29 compute-0 systemd-logind[821]: Removed session 10.
Feb 16 17:01:34 compute-0 sshd-session[46024]: Accepted publickey for zuul from 192.168.122.30 port 47340 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:01:34 compute-0 systemd-logind[821]: New session 11 of user zuul.
Feb 16 17:01:34 compute-0 systemd[1]: Started Session 11 of User zuul.
Feb 16 17:01:34 compute-0 sshd-session[46024]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:01:35 compute-0 python3.9[46177]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:01:37 compute-0 python3.9[46331]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:01:38 compute-0 sudo[46485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xotcravsiumncnyuqovtiuqdzhsrpxli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261297.7100964-80-249524681361479/AnsiballZ_command.py'
Feb 16 17:01:38 compute-0 sudo[46485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:38 compute-0 python3.9[46487]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:01:38 compute-0 sudo[46485]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:39 compute-0 python3.9[46638]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:01:39 compute-0 sudo[46792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odfiiavutrejnwjgvbewjebkaxlihjyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261299.651645-120-138315302578957/AnsiballZ_setup.py'
Feb 16 17:01:39 compute-0 sudo[46792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:40 compute-0 python3.9[46794]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:01:40 compute-0 sudo[46792]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:40 compute-0 sudo[46876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsoyvktkowljufiqystumzausktgsdtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261299.651645-120-138315302578957/AnsiballZ_dnf.py'
Feb 16 17:01:40 compute-0 sudo[46876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:41 compute-0 python3.9[46878]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:01:42 compute-0 sudo[46876]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:42 compute-0 sudo[47029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkcexateqjjwjxffxabioznzfapjgybm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261302.5536382-144-75888393901808/AnsiballZ_setup.py'
Feb 16 17:01:42 compute-0 sudo[47029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:43 compute-0 python3.9[47031]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:01:43 compute-0 sudo[47029]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:43 compute-0 sudo[47200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-deqyptebejtspzeuewiedzgnducdhlbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261303.441042-166-145563650316079/AnsiballZ_file.py'
Feb 16 17:01:43 compute-0 sudo[47200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:44 compute-0 python3.9[47202]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:01:44 compute-0 sudo[47200]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:44 compute-0 sudo[47352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmdassfrmaqcizepjriuulaheawefszy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261304.239484-182-104917176668530/AnsiballZ_command.py'
Feb 16 17:01:44 compute-0 sudo[47352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:44 compute-0 python3.9[47354]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:01:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck3372156183-merged.mount: Deactivated successfully.
Feb 16 17:01:44 compute-0 podman[47355]: 2026-02-16 17:01:44.695203653 +0000 UTC m=+0.061980351 system refresh
Feb 16 17:01:44 compute-0 sudo[47352]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:45 compute-0 sudo[47515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jffqfwleycmzxksfcotrkgimwhtyeylg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261304.9683685-198-51634270319210/AnsiballZ_stat.py'
Feb 16 17:01:45 compute-0 sudo[47515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:45 compute-0 python3.9[47517]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:01:45 compute-0 sudo[47515]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:45 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:01:45 compute-0 sudo[47638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzvmjpyqmkeypkfiazeecujbxeswhtbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261304.9683685-198-51634270319210/AnsiballZ_copy.py'
Feb 16 17:01:45 compute-0 sudo[47638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:46 compute-0 python3.9[47640]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261304.9683685-198-51634270319210/.source.json follow=False _original_basename=podman_network_config.j2 checksum=db7752b61b651625ee7130b45cb0233451b188a9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:01:46 compute-0 sudo[47638]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:46 compute-0 sudo[47790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibgeqnvnckyvjtcxqmtzvmcilgpgdelw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261306.3992052-228-182308773800008/AnsiballZ_stat.py'
Feb 16 17:01:46 compute-0 sudo[47790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:46 compute-0 python3.9[47792]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:01:46 compute-0 sudo[47790]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:47 compute-0 sudo[47913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jykdvfitnevdfkoxgkmponbvsuyqyavp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261306.3992052-228-182308773800008/AnsiballZ_copy.py'
Feb 16 17:01:47 compute-0 sudo[47913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:47 compute-0 python3.9[47915]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261306.3992052-228-182308773800008/.source.conf follow=False _original_basename=registries.conf.j2 checksum=937bbf009263dfa93b72b20b25de6a241077d8e0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:01:47 compute-0 sudo[47913]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:47 compute-0 sudo[48065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cujibxmgspfolyscknhbfaamuksrpyby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261307.5640612-260-203211012698158/AnsiballZ_ini_file.py'
Feb 16 17:01:47 compute-0 sudo[48065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:48 compute-0 python3.9[48067]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:01:48 compute-0 sudo[48065]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:48 compute-0 sudo[48217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpcvkemfbwzhrgashxbbhvlfwuxobmgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261308.3272266-260-41491054560671/AnsiballZ_ini_file.py'
Feb 16 17:01:48 compute-0 sudo[48217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:48 compute-0 python3.9[48219]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:01:48 compute-0 sudo[48217]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:49 compute-0 sudo[48369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgxtjtzefmnaehjnfnmisafibfqiqsms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261308.8680463-260-149192191042415/AnsiballZ_ini_file.py'
Feb 16 17:01:49 compute-0 sudo[48369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:49 compute-0 python3.9[48371]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:01:49 compute-0 sudo[48369]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:49 compute-0 sudo[48521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hushqgvwbheigjovcuiulepejrpcfqpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261309.3959641-260-151364680907071/AnsiballZ_ini_file.py'
Feb 16 17:01:49 compute-0 sudo[48521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:49 compute-0 python3.9[48523]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:01:49 compute-0 sudo[48521]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:50 compute-0 python3.9[48673]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:01:51 compute-0 sudo[48825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvpdrquzochxocnpjxxqxsuddsxdxchi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261310.940521-340-201692341623932/AnsiballZ_dnf.py'
Feb 16 17:01:51 compute-0 sudo[48825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:51 compute-0 python3.9[48827]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:01:52 compute-0 sudo[48825]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:53 compute-0 sudo[48978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckycbawleyhnsgsvprjtrzsaykggpfrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261312.854291-356-219632236906268/AnsiballZ_dnf.py'
Feb 16 17:01:53 compute-0 sudo[48978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:53 compute-0 python3.9[48980]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:01:57 compute-0 sudo[48978]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:57 compute-0 sudo[49139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuenlhxvvlvlpebwulkjpempdmnbsuxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261317.6154096-376-179763317640243/AnsiballZ_dnf.py'
Feb 16 17:01:57 compute-0 sudo[49139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:01:58 compute-0 python3.9[49141]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:01:59 compute-0 sudo[49139]: pam_unix(sudo:session): session closed for user root
Feb 16 17:01:59 compute-0 sudo[49292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnekivpdaiwrxsacfnmmvzkvbhvegehg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261319.7006257-394-238971339011828/AnsiballZ_dnf.py'
Feb 16 17:01:59 compute-0 sudo[49292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:00 compute-0 python3.9[49294]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:02:01 compute-0 sudo[49292]: pam_unix(sudo:session): session closed for user root
Feb 16 17:02:01 compute-0 sudo[49445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywccabqbpfefmqxbypmvhpvkjuasosob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261321.748867-416-197295645014528/AnsiballZ_dnf.py'
Feb 16 17:02:01 compute-0 sudo[49445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:02 compute-0 python3.9[49447]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:02:03 compute-0 sudo[49445]: pam_unix(sudo:session): session closed for user root
Feb 16 17:02:04 compute-0 sudo[49601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdzbqzszsqjqfdtbsjxyekixckhunmfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261324.1345122-432-55661252630513/AnsiballZ_dnf.py'
Feb 16 17:02:04 compute-0 sudo[49601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:04 compute-0 python3.9[49603]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:02:06 compute-0 sudo[49601]: pam_unix(sudo:session): session closed for user root
Feb 16 17:02:07 compute-0 sudo[49770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvnqbqxdroirbckzrkkjckgeqbovzvmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261327.4872968-450-63753383730780/AnsiballZ_dnf.py'
Feb 16 17:02:07 compute-0 sudo[49770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:07 compute-0 python3.9[49772]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:02:09 compute-0 sudo[49770]: pam_unix(sudo:session): session closed for user root
Feb 16 17:02:09 compute-0 sudo[49923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbjpwqltkiayytcnswjptgnvvseypbrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261329.4380476-468-110057794479064/AnsiballZ_dnf.py'
Feb 16 17:02:09 compute-0 sudo[49923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:09 compute-0 python3.9[49925]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:02:36 compute-0 sudo[49923]: pam_unix(sudo:session): session closed for user root
Feb 16 17:02:37 compute-0 sudo[50296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxftlknbgzrzffxcwfvvjqnvudlfyqkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261357.214455-486-121650391658706/AnsiballZ_dnf.py'
Feb 16 17:02:37 compute-0 sudo[50296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:37 compute-0 python3.9[50298]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:02:38 compute-0 sudo[50296]: pam_unix(sudo:session): session closed for user root
Feb 16 17:02:39 compute-0 sudo[50452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evwyngbxrqmhqrkmfpljrjvuozyaasjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261359.3769841-506-262389827137819/AnsiballZ_dnf.py'
Feb 16 17:02:39 compute-0 sudo[50452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:39 compute-0 python3.9[50454]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:02:47 compute-0 sudo[50452]: pam_unix(sudo:session): session closed for user root
Feb 16 17:02:48 compute-0 sudo[50649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dswylfielpczlybwfjcctrtbesqmesse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261368.1855218-528-208337452833244/AnsiballZ_file.py'
Feb 16 17:02:48 compute-0 sudo[50649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:48 compute-0 python3.9[50651]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:02:48 compute-0 sudo[50649]: pam_unix(sudo:session): session closed for user root
Feb 16 17:02:49 compute-0 sudo[50824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twlastezfokrkzqaktlbpujnqouadqta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261368.8193593-544-8338723888016/AnsiballZ_stat.py'
Feb 16 17:02:49 compute-0 sudo[50824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:49 compute-0 python3.9[50826]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:02:49 compute-0 sudo[50824]: pam_unix(sudo:session): session closed for user root
Feb 16 17:02:49 compute-0 sudo[50947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bycazixcfswlzpjdwxkgvxeqfuqqvynr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261368.8193593-544-8338723888016/AnsiballZ_copy.py'
Feb 16 17:02:49 compute-0 sudo[50947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:49 compute-0 python3.9[50949]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1771261368.8193593-544-8338723888016/.source.json _original_basename=.hj38tv75 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:02:49 compute-0 sudo[50947]: pam_unix(sudo:session): session closed for user root
Feb 16 17:02:50 compute-0 sudo[51099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jltqvzhthjfxnorwvjpvarujiofndzjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261370.1620977-580-18618520809769/AnsiballZ_podman_image.py'
Feb 16 17:02:50 compute-0 sudo[51099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:50 compute-0 python3.9[51101]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 17:02:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:02:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat664853678-lower\x2dmapped.mount: Deactivated successfully.
Feb 16 17:02:55 compute-0 podman[51113]: 2026-02-16 17:02:55.452220881 +0000 UTC m=+4.591284278 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 16 17:02:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:02:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:02:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:02:55 compute-0 sudo[51099]: pam_unix(sudo:session): session closed for user root
Feb 16 17:02:56 compute-0 sudo[51406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmydglvjmtkqkcdokarzsroazdyskjea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261375.9831395-602-69659531329192/AnsiballZ_podman_image.py'
Feb 16 17:02:56 compute-0 sudo[51406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:02:56 compute-0 python3.9[51408]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 17:03:03 compute-0 podman[51421]: 2026-02-16 17:03:03.531934919 +0000 UTC m=+7.070155241 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:03:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:03 compute-0 sudo[51406]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:05 compute-0 sudo[51719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzvyrejoppktizytuerhvzxgprihxekl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261384.788737-622-260828173910745/AnsiballZ_podman_image.py'
Feb 16 17:03:05 compute-0 sudo[51719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:05 compute-0 python3.9[51721]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 17:03:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:14 compute-0 podman[51734]: 2026-02-16 17:03:14.980422444 +0000 UTC m=+9.682542708 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 16 17:03:14 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:15 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:15 compute-0 sudo[51719]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:15 compute-0 sudo[52017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xerhepvjqjwrmcriqdetbczrwlqbonsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261395.5324628-644-156399371620551/AnsiballZ_podman_image.py'
Feb 16 17:03:15 compute-0 sudo[52017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:15 compute-0 python3.9[52019]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 17:03:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:18 compute-0 podman[52030]: 2026-02-16 17:03:18.336089032 +0000 UTC m=+2.316885278 image pull be811c7ef606e5fdf21f4bb60e867487043c4ca0ef316c864692549ee6c1c369 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 16 17:03:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:18 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:18 compute-0 sudo[52017]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:18 compute-0 sudo[52282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irvngphunpjeczsbgixegpguuooxddoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261398.6303396-644-33603984567153/AnsiballZ_podman_image.py'
Feb 16 17:03:18 compute-0 sudo[52282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:19 compute-0 python3.9[52284]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 16 17:03:20 compute-0 podman[52297]: 2026-02-16 17:03:20.275742078 +0000 UTC m=+1.162453467 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Feb 16 17:03:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:20 compute-0 sudo[52282]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:03:23 compute-0 sshd-session[46027]: Connection closed by 192.168.122.30 port 47340
Feb 16 17:03:23 compute-0 sshd-session[46024]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:03:23 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Feb 16 17:03:23 compute-0 systemd[1]: session-11.scope: Consumed 1min 37.187s CPU time.
Feb 16 17:03:23 compute-0 systemd-logind[821]: Session 11 logged out. Waiting for processes to exit.
Feb 16 17:03:23 compute-0 systemd-logind[821]: Removed session 11.
Feb 16 17:03:28 compute-0 sshd-session[52440]: Accepted publickey for zuul from 192.168.122.30 port 58116 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:03:28 compute-0 systemd-logind[821]: New session 12 of user zuul.
Feb 16 17:03:28 compute-0 systemd[1]: Started Session 12 of User zuul.
Feb 16 17:03:28 compute-0 sshd-session[52440]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:03:29 compute-0 python3.9[52593]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:03:30 compute-0 sudo[52747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuenvxoisuxxbhuehqrernuaakiplqgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261410.0611444-54-150264881498873/AnsiballZ_getent.py'
Feb 16 17:03:30 compute-0 sudo[52747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:30 compute-0 python3.9[52749]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 16 17:03:30 compute-0 sudo[52747]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:31 compute-0 sudo[52900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qaxprtqleuubgcaktceauhfrsvxizsvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261410.8184304-70-122415559828144/AnsiballZ_group.py'
Feb 16 17:03:31 compute-0 sudo[52900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:31 compute-0 python3.9[52902]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 17:03:31 compute-0 groupadd[52903]: group added to /etc/group: name=openvswitch, GID=42476
Feb 16 17:03:31 compute-0 groupadd[52903]: group added to /etc/gshadow: name=openvswitch
Feb 16 17:03:31 compute-0 groupadd[52903]: new group: name=openvswitch, GID=42476
Feb 16 17:03:31 compute-0 sudo[52900]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:32 compute-0 sudo[53058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpgjqnruqbgtahygqvflarsevuqrhseq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261411.6029117-86-156089445292204/AnsiballZ_user.py'
Feb 16 17:03:32 compute-0 sudo[53058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:32 compute-0 python3.9[53060]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 17:03:32 compute-0 useradd[53062]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 17:03:32 compute-0 useradd[53062]: add 'openvswitch' to group 'hugetlbfs'
Feb 16 17:03:32 compute-0 useradd[53062]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 16 17:03:32 compute-0 sudo[53058]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:33 compute-0 sudo[53218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilbixnmdwqjeglyaipxhxilbwibjbvgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261412.700999-106-230664928068023/AnsiballZ_setup.py'
Feb 16 17:03:33 compute-0 sudo[53218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:33 compute-0 python3.9[53220]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:03:33 compute-0 sudo[53218]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:33 compute-0 sudo[53302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udwewwkewavmkhhmsarpknmxtptnekza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261412.700999-106-230664928068023/AnsiballZ_dnf.py'
Feb 16 17:03:33 compute-0 sudo[53302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:34 compute-0 python3.9[53304]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:03:35 compute-0 sudo[53302]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:36 compute-0 sudo[53464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpgdtbbxtqjeyceoenwpjsygthkzmslp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261415.9482741-134-212937915349635/AnsiballZ_dnf.py'
Feb 16 17:03:36 compute-0 sudo[53464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:36 compute-0 python3.9[53466]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:03:46 compute-0 kernel: SELinux:  Converting 2740 SID table entries...
Feb 16 17:03:46 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 17:03:46 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 17:03:46 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 17:03:46 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 17:03:46 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 17:03:46 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 17:03:46 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 17:03:47 compute-0 groupadd[53489]: group added to /etc/group: name=unbound, GID=994
Feb 16 17:03:47 compute-0 groupadd[53489]: group added to /etc/gshadow: name=unbound
Feb 16 17:03:47 compute-0 groupadd[53489]: new group: name=unbound, GID=994
Feb 16 17:03:47 compute-0 useradd[53496]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Feb 16 17:03:47 compute-0 dbus-broker-launch[807]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 16 17:03:47 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 16 17:03:48 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 17:03:48 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 17:03:48 compute-0 systemd[1]: Reloading.
Feb 16 17:03:48 compute-0 systemd-rc-local-generator[53996]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:03:48 compute-0 systemd-sysv-generator[53999]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:03:48 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 17:03:49 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 17:03:49 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 17:03:49 compute-0 systemd[1]: run-r00af67c3cbd84b228618b2476f7b80fb.service: Deactivated successfully.
Feb 16 17:03:49 compute-0 sudo[53464]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:50 compute-0 sudo[54575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbhvnkznkbxwmexosygxmueqxydmqakx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261429.8879516-150-78147068742699/AnsiballZ_systemd.py'
Feb 16 17:03:50 compute-0 sudo[54575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:50 compute-0 python3.9[54577]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 17:03:50 compute-0 systemd[1]: Reloading.
Feb 16 17:03:50 compute-0 systemd-rc-local-generator[54608]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:03:50 compute-0 systemd-sysv-generator[54611]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:03:51 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Feb 16 17:03:51 compute-0 chown[54625]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 16 17:03:51 compute-0 ovs-ctl[54630]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 16 17:03:51 compute-0 ovs-ctl[54630]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 16 17:03:51 compute-0 ovs-ctl[54630]: Starting ovsdb-server [  OK  ]
Feb 16 17:03:51 compute-0 ovs-vsctl[54679]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 16 17:03:51 compute-0 ovs-vsctl[54699]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"09f26141-c730-49d9-ad1c-7063ea4246fa\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Feb 16 17:03:51 compute-0 ovs-ctl[54630]: Configuring Open vSwitch system IDs [  OK  ]
Feb 16 17:03:51 compute-0 ovs-vsctl[54705]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 16 17:03:51 compute-0 ovs-ctl[54630]: Enabling remote OVSDB managers [  OK  ]
Feb 16 17:03:51 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Feb 16 17:03:51 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 16 17:03:51 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 16 17:03:51 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 16 17:03:51 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Feb 16 17:03:51 compute-0 ovs-ctl[54750]: Inserting openvswitch module [  OK  ]
Feb 16 17:03:51 compute-0 ovs-ctl[54719]: Starting ovs-vswitchd [  OK  ]
Feb 16 17:03:51 compute-0 ovs-vsctl[54767]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 16 17:03:51 compute-0 ovs-ctl[54719]: Enabling remote OVSDB managers [  OK  ]
Feb 16 17:03:51 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 16 17:03:51 compute-0 systemd[1]: Starting Open vSwitch...
Feb 16 17:03:51 compute-0 systemd[1]: Finished Open vSwitch.
Feb 16 17:03:51 compute-0 sudo[54575]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:52 compute-0 python3.9[54919]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:03:53 compute-0 sudo[55069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oakqdsodktyarkmcyjrfvrgdcycxcsed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261432.6672478-188-171969013557007/AnsiballZ_sefcontext.py'
Feb 16 17:03:53 compute-0 sudo[55069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:53 compute-0 python3.9[55071]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 16 17:03:54 compute-0 kernel: SELinux:  Converting 2754 SID table entries...
Feb 16 17:03:54 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 17:03:54 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 17:03:54 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 17:03:54 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 17:03:54 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 17:03:54 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 17:03:54 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 17:03:54 compute-0 sudo[55069]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:55 compute-0 python3.9[55227]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:03:56 compute-0 sudo[55383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcamonyniweoyyucolzebxfgzeypufsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261435.7514393-224-215428349058370/AnsiballZ_dnf.py'
Feb 16 17:03:56 compute-0 dbus-broker-launch[807]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 16 17:03:56 compute-0 sudo[55383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:56 compute-0 python3.9[55385]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:03:57 compute-0 sudo[55383]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:58 compute-0 sudo[55536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xygietyfhkxvmwefdzilcuomkopuctot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261437.7252831-240-121110177683371/AnsiballZ_command.py'
Feb 16 17:03:58 compute-0 sudo[55536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:58 compute-0 python3.9[55538]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:03:58 compute-0 sudo[55536]: pam_unix(sudo:session): session closed for user root
Feb 16 17:03:59 compute-0 sudo[55823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vridspaovmdddlctmttbqdnpojxcjyia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261439.0997572-256-199233530628267/AnsiballZ_file.py'
Feb 16 17:03:59 compute-0 sudo[55823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:03:59 compute-0 python3.9[55825]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 16 17:03:59 compute-0 sudo[55823]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:00 compute-0 python3.9[55975]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:04:01 compute-0 sudo[56127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdhnamjnhvosjwqrrrggnvmepvthmhyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261440.848819-288-121339805589592/AnsiballZ_dnf.py'
Feb 16 17:04:01 compute-0 sudo[56127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:01 compute-0 python3.9[56129]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:04:03 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 17:04:03 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 17:04:03 compute-0 systemd[1]: Reloading.
Feb 16 17:04:03 compute-0 systemd-rc-local-generator[56160]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:04:03 compute-0 systemd-sysv-generator[56166]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:04:03 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 17:04:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 17:04:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 17:04:03 compute-0 systemd[1]: run-r171626cb313a483395cd5278121d4180.service: Deactivated successfully.
Feb 16 17:04:03 compute-0 sudo[56127]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:04 compute-0 sudo[56451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcarsoujaofvfjfifhowjcsreejupriz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261443.8119206-304-121831651843494/AnsiballZ_systemd.py'
Feb 16 17:04:04 compute-0 sudo[56451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:04 compute-0 python3.9[56453]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:04:04 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 16 17:04:04 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Feb 16 17:04:04 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Feb 16 17:04:04 compute-0 NetworkManager[7686]: <info>  [1771261444.4083] caught SIGTERM, shutting down normally.
Feb 16 17:04:04 compute-0 systemd[1]: Stopping Network Manager...
Feb 16 17:04:04 compute-0 NetworkManager[7686]: <info>  [1771261444.4100] dhcp4 (eth0): canceled DHCP transaction
Feb 16 17:04:04 compute-0 NetworkManager[7686]: <info>  [1771261444.4100] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 17:04:04 compute-0 NetworkManager[7686]: <info>  [1771261444.4100] dhcp4 (eth0): state changed no lease
Feb 16 17:04:04 compute-0 NetworkManager[7686]: <info>  [1771261444.4104] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 17:04:04 compute-0 NetworkManager[7686]: <info>  [1771261444.4174] exiting (success)
Feb 16 17:04:04 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 17:04:04 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 17:04:04 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 16 17:04:04 compute-0 systemd[1]: Stopped Network Manager.
Feb 16 17:04:04 compute-0 systemd[1]: NetworkManager.service: Consumed 19.833s CPU time, 4.1M memory peak, read 0B from disk, written 41.5K to disk.
Feb 16 17:04:04 compute-0 systemd[1]: Starting Network Manager...
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.4786] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:b69df90a-35c3-4c3f-8202-6e7c0e72a85a)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.4789] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.4854] manager[0x56489c80f000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 16 17:04:04 compute-0 systemd[1]: Starting Hostname Service...
Feb 16 17:04:04 compute-0 systemd[1]: Started Hostname Service.
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5535] hostname: hostname: using hostnamed
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5536] hostname: static hostname changed from (none) to "compute-0"
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5542] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5546] manager[0x56489c80f000]: rfkill: Wi-Fi hardware radio set enabled
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5546] manager[0x56489c80f000]: rfkill: WWAN hardware radio set enabled
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5566] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5575] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5576] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5577] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5577] manager: Networking is enabled by state file
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5579] settings: Loaded settings plugin: keyfile (internal)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5582] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5610] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5621] dhcp: init: Using DHCP client 'internal'
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5624] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5629] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5634] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5641] device (lo): Activation: starting connection 'lo' (06a79a63-f313-4cf6-b532-81d7b1898ab4)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5646] device (eth0): carrier: link connected
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5650] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5655] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5655] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5662] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5667] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5672] device (eth1): carrier: link connected
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5675] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5679] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (4afdf514-3b13-5696-9896-ab4bb68602bc) (indicated)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5679] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5683] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5689] device (eth1): Activation: starting connection 'ci-private-network' (4afdf514-3b13-5696-9896-ab4bb68602bc)
Feb 16 17:04:04 compute-0 systemd[1]: Started Network Manager.
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5694] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5704] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5706] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5708] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5710] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5712] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5714] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5716] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5736] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5749] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5754] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5767] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5789] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5807] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5812] dhcp4 (eth0): state changed new lease, address=38.102.83.146
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5818] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5831] device (lo): Activation: successful, device activated.
Feb 16 17:04:04 compute-0 systemd[1]: Starting Network Manager Wait Online...
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5849] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5933] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5943] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5945] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5951] manager: NetworkManager state is now CONNECTED_LOCAL
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5955] device (eth1): Activation: successful, device activated.
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5968] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5970] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5983] manager: NetworkManager state is now CONNECTED_SITE
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5986] device (eth0): Activation: successful, device activated.
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.5990] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 16 17:04:04 compute-0 sudo[56451]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:04 compute-0 NetworkManager[56463]: <info>  [1771261444.6004] manager: startup complete
Feb 16 17:04:04 compute-0 systemd[1]: Finished Network Manager Wait Online.
Feb 16 17:04:05 compute-0 sudo[56677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viejmdotbstzhhvkdosrapubpcwntwjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261444.9439447-320-269802503177080/AnsiballZ_dnf.py'
Feb 16 17:04:05 compute-0 sudo[56677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:05 compute-0 python3.9[56679]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:04:09 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 17:04:09 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 17:04:09 compute-0 systemd[1]: Reloading.
Feb 16 17:04:09 compute-0 systemd-rc-local-generator[56730]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:04:09 compute-0 systemd-sysv-generator[56733]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:04:10 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 17:04:10 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 17:04:10 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 17:04:10 compute-0 systemd[1]: run-r857c1798ef4242f29c487bee40e3bb97.service: Deactivated successfully.
Feb 16 17:04:10 compute-0 sudo[56677]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:11 compute-0 sudo[57148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcpbxdozssevcuoppchliqbsioarpzeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261451.4218163-344-6093464060496/AnsiballZ_stat.py'
Feb 16 17:04:11 compute-0 sudo[57148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:11 compute-0 python3.9[57150]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:04:11 compute-0 sudo[57148]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:12 compute-0 sudo[57300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwvpnhdxvdlwpvzbwbvybsdfvfxdjxzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261452.124813-362-93934793422016/AnsiballZ_ini_file.py'
Feb 16 17:04:12 compute-0 sudo[57300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:12 compute-0 python3.9[57302]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:12 compute-0 sudo[57300]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:13 compute-0 sudo[57454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iodzahxxvfyawbgifjcrtftlbeanlplc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261452.9203954-382-118995421801250/AnsiballZ_ini_file.py'
Feb 16 17:04:13 compute-0 sudo[57454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:13 compute-0 python3.9[57456]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:13 compute-0 sudo[57454]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:14 compute-0 sudo[57606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymxbngljdulziqcpsrqlmiuuhrvankok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261453.783279-382-178309568777871/AnsiballZ_ini_file.py'
Feb 16 17:04:14 compute-0 sudo[57606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:14 compute-0 python3.9[57608]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:14 compute-0 sudo[57606]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:14 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 17:04:14 compute-0 sudo[57758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzfbexnsmuqzhyfnyiuheipyxdyqsdok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261454.4459753-412-64766101971680/AnsiballZ_ini_file.py'
Feb 16 17:04:14 compute-0 sudo[57758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:14 compute-0 python3.9[57760]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:14 compute-0 sudo[57758]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:15 compute-0 sudo[57910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrogczrwjmmqspaibtwxfbtrcrdcitvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261455.0789113-412-87141491402679/AnsiballZ_ini_file.py'
Feb 16 17:04:15 compute-0 sudo[57910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:15 compute-0 python3.9[57912]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:15 compute-0 sudo[57910]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:15 compute-0 sudo[58062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omdpdtaxwojdpezwthbihgnfhwbmuvmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261455.685854-442-238096936451954/AnsiballZ_stat.py'
Feb 16 17:04:15 compute-0 sudo[58062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:16 compute-0 python3.9[58064]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:04:16 compute-0 sudo[58062]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:16 compute-0 sudo[58185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oofdtyhedqqzpbowosebsplznwawkdzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261455.685854-442-238096936451954/AnsiballZ_copy.py'
Feb 16 17:04:16 compute-0 sudo[58185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:16 compute-0 python3.9[58187]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261455.685854-442-238096936451954/.source _original_basename=.3u7h2vr5 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:16 compute-0 sudo[58185]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:17 compute-0 sudo[58337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaircrbglomccxfitrjalovqbttrhucl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261457.0457368-472-146777354798962/AnsiballZ_file.py'
Feb 16 17:04:17 compute-0 sudo[58337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:17 compute-0 python3.9[58339]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:17 compute-0 sudo[58337]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:18 compute-0 sudo[58489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jywekryzejbkpwforktlalzlnlhfocqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261457.6120286-488-24116948946301/AnsiballZ_edpm_os_net_config_mappings.py'
Feb 16 17:04:18 compute-0 sudo[58489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:18 compute-0 python3.9[58491]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 16 17:04:18 compute-0 sudo[58489]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:18 compute-0 sudo[58641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvwbcsrdfdecfamlwzaaeyfncunyfhci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261458.461461-506-92899280052613/AnsiballZ_file.py'
Feb 16 17:04:18 compute-0 sudo[58641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:18 compute-0 python3.9[58643]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:19 compute-0 sudo[58641]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:19 compute-0 sudo[58793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obvtivudeakzuwscevpktmwhbtiqldyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261459.2986639-526-253135909046196/AnsiballZ_stat.py'
Feb 16 17:04:19 compute-0 sudo[58793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:19 compute-0 sudo[58793]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:20 compute-0 sudo[58916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beziagjzpsassnjodjgdigenvvpyjntp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261459.2986639-526-253135909046196/AnsiballZ_copy.py'
Feb 16 17:04:20 compute-0 sudo[58916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:20 compute-0 sudo[58916]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:20 compute-0 sudo[59068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vevpzashvalkgolhbtniuygxvxmmeida ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261460.5189772-556-245630842629108/AnsiballZ_slurp.py'
Feb 16 17:04:20 compute-0 sudo[59068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:21 compute-0 python3.9[59070]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 16 17:04:21 compute-0 sudo[59068]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:22 compute-0 sudo[59243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adhdfrtvpnnbmkwnjfgvhbdenbperujn ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261461.6713624-574-160271843522051/async_wrapper.py j210899036004 300 /home/zuul/.ansible/tmp/ansible-tmp-1771261461.6713624-574-160271843522051/AnsiballZ_edpm_os_net_config.py _'
Feb 16 17:04:22 compute-0 sudo[59243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:22 compute-0 ansible-async_wrapper.py[59245]: Invoked with j210899036004 300 /home/zuul/.ansible/tmp/ansible-tmp-1771261461.6713624-574-160271843522051/AnsiballZ_edpm_os_net_config.py _
Feb 16 17:04:22 compute-0 ansible-async_wrapper.py[59248]: Starting module and watcher
Feb 16 17:04:22 compute-0 ansible-async_wrapper.py[59248]: Start watching 59249 (300)
Feb 16 17:04:22 compute-0 ansible-async_wrapper.py[59249]: Start module (59249)
Feb 16 17:04:22 compute-0 ansible-async_wrapper.py[59245]: Return async_wrapper task started.
Feb 16 17:04:22 compute-0 sudo[59243]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:22 compute-0 python3.9[59250]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=True purge_provider=
Feb 16 17:04:23 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 16 17:04:23 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 16 17:04:23 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Feb 16 17:04:23 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 16 17:04:23 compute-0 kernel: cfg80211: failed to load regulatory.db
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.4816] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.4840] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5330] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5333] audit: op="connection-add" uuid="0d716dd9-f093-4a6f-aa95-ef9aaea0f94f" name="br-ex-br" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5346] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5347] audit: op="connection-add" uuid="baca1efc-477c-4f23-b536-f8815893ea2e" name="br-ex-port" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5356] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5358] audit: op="connection-add" uuid="0684c8cc-c57c-4d47-9e73-df9efc425d92" name="eth1-port" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5368] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5370] audit: op="connection-add" uuid="da685b41-44d1-457a-a121-efae500169ed" name="vlan20-port" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5379] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5381] audit: op="connection-add" uuid="d76a21f4-95cc-4a48-ba2c-bcf2f1aa2991" name="vlan21-port" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5392] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5394] audit: op="connection-add" uuid="3d814ae7-59d7-4190-a42e-4132a6a8fe50" name="vlan22-port" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5411] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5425] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5427] audit: op="connection-add" uuid="d5a14438-3f26-4eb2-9fcf-6a0a1bca5b4a" name="br-ex-if" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5472] audit: op="connection-update" uuid="4afdf514-3b13-5696-9896-ab4bb68602bc" name="ci-private-network" args="ipv4.method,ipv4.routes,ipv4.dns,ipv4.addresses,ipv4.routing-rules,ipv4.never-default,ipv6.addr-gen-mode,ipv6.method,ipv6.dns,ipv6.addresses,ipv6.routing-rules,ipv6.routes,ovs-interface.type,connection.controller,connection.slave-type,connection.timestamp,connection.master,connection.port-type,ovs-external-ids.data" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5486] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5488] audit: op="connection-add" uuid="ed40761f-9fe8-4cec-8f34-84950f44de51" name="vlan20-if" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5501] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5503] audit: op="connection-add" uuid="69d89858-bcde-486b-b434-e6462810efac" name="vlan21-if" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5517] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5518] audit: op="connection-add" uuid="900e284d-1782-40b5-ba6e-a03db1ef493c" name="vlan22-if" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5529] audit: op="connection-delete" uuid="5343486a-909f-3711-a4c4-9479ef6f81b7" name="Wired connection 1" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5540] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <warn>  [1771261464.5543] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5548] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5552] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (0d716dd9-f093-4a6f-aa95-ef9aaea0f94f)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5552] audit: op="connection-activate" uuid="0d716dd9-f093-4a6f-aa95-ef9aaea0f94f" name="br-ex-br" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5554] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <warn>  [1771261464.5555] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5560] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5563] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (baca1efc-477c-4f23-b536-f8815893ea2e)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5565] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <warn>  [1771261464.5566] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5569] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5573] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (0684c8cc-c57c-4d47-9e73-df9efc425d92)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5576] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <warn>  [1771261464.5577] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5583] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5588] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (da685b41-44d1-457a-a121-efae500169ed)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5591] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <warn>  [1771261464.5592] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5598] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5603] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (d76a21f4-95cc-4a48-ba2c-bcf2f1aa2991)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5605] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <warn>  [1771261464.5607] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5614] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5618] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (3d814ae7-59d7-4190-a42e-4132a6a8fe50)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5620] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5623] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5626] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5633] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <warn>  [1771261464.5635] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5639] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5644] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (d5a14438-3f26-4eb2-9fcf-6a0a1bca5b4a)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5646] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5650] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5653] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5655] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5656] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5668] device (eth1): disconnecting for new activation request.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5669] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5673] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5675] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5677] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5680] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <warn>  [1771261464.5681] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5685] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5690] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (ed40761f-9fe8-4cec-8f34-84950f44de51)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5692] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5696] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5698] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5701] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5705] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <warn>  [1771261464.5707] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5711] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5715] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (69d89858-bcde-486b-b434-e6462810efac)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5717] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5721] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5723] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5725] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5728] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <warn>  [1771261464.5729] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5733] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5737] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (900e284d-1782-40b5-ba6e-a03db1ef493c)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5739] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5742] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5745] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5746] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5748] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5762] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,802-3-ethernet.mtu,connection.autoconnect-priority" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5764] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5767] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5770] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5780] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5785] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5788] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 kernel: ovs-system: entered promiscuous mode
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5803] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5805] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5810] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5815] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 systemd-udevd[59257]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:04:24 compute-0 kernel: Timeout policy base is empty
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5818] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5821] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5825] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5828] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5830] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5832] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5836] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5840] dhcp4 (eth0): canceled DHCP transaction
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5840] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5840] dhcp4 (eth0): state changed no lease
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5841] dhcp4 (eth0): activation: beginning transaction (no timeout)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5851] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5854] audit: op="device-reapply" interface="eth1" ifindex=3 pid=59251 uid=0 result="fail" reason="Device is not activated"
Feb 16 17:04:24 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5922] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5925] dhcp4 (eth0): state changed new lease, address=38.102.83.146
Feb 16 17:04:24 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5961] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5972] device (eth1): disconnecting for new activation request.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5973] audit: op="connection-activate" uuid="4afdf514-3b13-5696-9896-ab4bb68602bc" name="ci-private-network" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.5973] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb 16 17:04:24 compute-0 kernel: br-ex: entered promiscuous mode
Feb 16 17:04:24 compute-0 kernel: vlan22: entered promiscuous mode
Feb 16 17:04:24 compute-0 systemd-udevd[59256]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:04:24 compute-0 kernel: vlan20: entered promiscuous mode
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6120] device (eth1): Activation: starting connection 'ci-private-network' (4afdf514-3b13-5696-9896-ab4bb68602bc)
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6126] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 systemd-udevd[59255]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6133] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6152] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 kernel: vlan21: entered promiscuous mode
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6172] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6179] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6184] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6234] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6236] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6238] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6240] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6242] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6244] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59251 uid=0 result="success"
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6258] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6274] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6288] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6292] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6297] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6301] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6306] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6311] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6317] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6320] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6324] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6329] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6339] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6342] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6346] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6351] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6368] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6372] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6386] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6408] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6413] device (eth1): Activation: successful, device activated.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6418] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6424] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6431] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6445] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6450] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6458] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6464] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6465] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6468] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6472] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6477] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6481] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6487] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6488] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 16 17:04:24 compute-0 NetworkManager[56463]: <info>  [1771261464.6492] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 16 17:04:25 compute-0 NetworkManager[56463]: <info>  [1771261465.7681] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59251 uid=0 result="success"
Feb 16 17:04:25 compute-0 NetworkManager[56463]: <info>  [1771261465.9256] checkpoint[0x56489c7e4950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Feb 16 17:04:25 compute-0 NetworkManager[56463]: <info>  [1771261465.9258] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59251 uid=0 result="success"
Feb 16 17:04:26 compute-0 sudo[59584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icelksfvwltiejsyonteztlgyjaouhvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261465.7318416-574-100969559539620/AnsiballZ_async_status.py'
Feb 16 17:04:26 compute-0 sudo[59584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:26 compute-0 NetworkManager[56463]: <info>  [1771261466.2609] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59251 uid=0 result="success"
Feb 16 17:04:26 compute-0 NetworkManager[56463]: <info>  [1771261466.2619] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59251 uid=0 result="success"
Feb 16 17:04:26 compute-0 python3.9[59586]: ansible-ansible.legacy.async_status Invoked with jid=j210899036004.59245 mode=status _async_dir=/root/.ansible_async
Feb 16 17:04:26 compute-0 sudo[59584]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:26 compute-0 NetworkManager[56463]: <info>  [1771261466.4218] audit: op="networking-control" arg="global-dns-configuration" pid=59251 uid=0 result="success"
Feb 16 17:04:26 compute-0 NetworkManager[56463]: <info>  [1771261466.4240] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Feb 16 17:04:26 compute-0 NetworkManager[56463]: <info>  [1771261466.4263] audit: op="networking-control" arg="global-dns-configuration" pid=59251 uid=0 result="success"
Feb 16 17:04:26 compute-0 NetworkManager[56463]: <info>  [1771261466.4282] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59251 uid=0 result="success"
Feb 16 17:04:26 compute-0 NetworkManager[56463]: <info>  [1771261466.5196] checkpoint[0x56489c7e4a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Feb 16 17:04:26 compute-0 NetworkManager[56463]: <info>  [1771261466.5199] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59251 uid=0 result="success"
Feb 16 17:04:26 compute-0 ansible-async_wrapper.py[59249]: Module complete (59249)
Feb 16 17:04:27 compute-0 ansible-async_wrapper.py[59248]: Done in kid B.
Feb 16 17:04:29 compute-0 sudo[59688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcsqeomxbkwelchpwkxncqgfofyyubbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261465.7318416-574-100969559539620/AnsiballZ_async_status.py'
Feb 16 17:04:29 compute-0 sudo[59688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:29 compute-0 python3.9[59690]: ansible-ansible.legacy.async_status Invoked with jid=j210899036004.59245 mode=status _async_dir=/root/.ansible_async
Feb 16 17:04:29 compute-0 sudo[59688]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:30 compute-0 sudo[59788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzipwnicyovwtloxklcpeyewzphhxfxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261465.7318416-574-100969559539620/AnsiballZ_async_status.py'
Feb 16 17:04:30 compute-0 sudo[59788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:30 compute-0 python3.9[59790]: ansible-ansible.legacy.async_status Invoked with jid=j210899036004.59245 mode=cleanup _async_dir=/root/.ansible_async
Feb 16 17:04:30 compute-0 sudo[59788]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:31 compute-0 sudo[59940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjncrtyissrozupsynyzthbvxxppxbvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261471.1410692-628-138112145796131/AnsiballZ_stat.py'
Feb 16 17:04:31 compute-0 sudo[59940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:31 compute-0 python3.9[59942]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:04:31 compute-0 sudo[59940]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:32 compute-0 sudo[60063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suhtraumhcxqlkvhrzdcmspbbnicvdiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261471.1410692-628-138112145796131/AnsiballZ_copy.py'
Feb 16 17:04:32 compute-0 sudo[60063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:32 compute-0 python3.9[60065]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261471.1410692-628-138112145796131/.source.returncode _original_basename=.sorce8t8 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:32 compute-0 sudo[60063]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:33 compute-0 sudo[60216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyfwhsdcngkyrzpqxsaactijzybobhbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261472.8542655-660-223104038752035/AnsiballZ_stat.py'
Feb 16 17:04:33 compute-0 sudo[60216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:33 compute-0 python3.9[60218]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:04:33 compute-0 sudo[60216]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:33 compute-0 sudo[60339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tseenoxmxsztxtyqvwltxvlnimzvdohy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261472.8542655-660-223104038752035/AnsiballZ_copy.py'
Feb 16 17:04:33 compute-0 sudo[60339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:34 compute-0 python3.9[60341]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261472.8542655-660-223104038752035/.source.cfg _original_basename=.1mc460e3 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:34 compute-0 sudo[60339]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:34 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 17:04:35 compute-0 sudo[60493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfqcknvnylvvpdgcuurelcwmeygqjros ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261474.7315564-690-57616897089396/AnsiballZ_systemd.py'
Feb 16 17:04:35 compute-0 sudo[60493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:35 compute-0 python3.9[60495]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:04:35 compute-0 systemd[1]: Reloading Network Manager...
Feb 16 17:04:35 compute-0 NetworkManager[56463]: <info>  [1771261475.4627] audit: op="reload" arg="0" pid=60499 uid=0 result="success"
Feb 16 17:04:35 compute-0 NetworkManager[56463]: <info>  [1771261475.4638] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Feb 16 17:04:35 compute-0 systemd[1]: Reloaded Network Manager.
Feb 16 17:04:35 compute-0 sudo[60493]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:36 compute-0 sshd-session[52443]: Connection closed by 192.168.122.30 port 58116
Feb 16 17:04:36 compute-0 sshd-session[52440]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:04:36 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Feb 16 17:04:36 compute-0 systemd[1]: session-12.scope: Consumed 45.886s CPU time.
Feb 16 17:04:36 compute-0 systemd-logind[821]: Session 12 logged out. Waiting for processes to exit.
Feb 16 17:04:36 compute-0 systemd-logind[821]: Removed session 12.
Feb 16 17:04:41 compute-0 sshd-session[60530]: Accepted publickey for zuul from 192.168.122.30 port 37638 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:04:41 compute-0 systemd-logind[821]: New session 13 of user zuul.
Feb 16 17:04:41 compute-0 systemd[1]: Started Session 13 of User zuul.
Feb 16 17:04:41 compute-0 sshd-session[60530]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:04:42 compute-0 python3.9[60683]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:04:43 compute-0 python3.9[60838]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:04:44 compute-0 python3.9[61027]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:04:45 compute-0 sshd-session[60533]: Connection closed by 192.168.122.30 port 37638
Feb 16 17:04:45 compute-0 sshd-session[60530]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:04:45 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Feb 16 17:04:45 compute-0 systemd[1]: session-13.scope: Consumed 2.037s CPU time.
Feb 16 17:04:45 compute-0 systemd-logind[821]: Session 13 logged out. Waiting for processes to exit.
Feb 16 17:04:45 compute-0 systemd-logind[821]: Removed session 13.
Feb 16 17:04:45 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 16 17:04:50 compute-0 sshd-session[61056]: Accepted publickey for zuul from 192.168.122.30 port 53710 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:04:50 compute-0 systemd-logind[821]: New session 14 of user zuul.
Feb 16 17:04:50 compute-0 systemd[1]: Started Session 14 of User zuul.
Feb 16 17:04:50 compute-0 sshd-session[61056]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:04:51 compute-0 python3.9[61209]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:04:52 compute-0 python3.9[61364]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:04:53 compute-0 sudo[61518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obklrnfxuqumjwcoefiuqdkxfowzkhfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261492.838528-59-138886746094215/AnsiballZ_setup.py'
Feb 16 17:04:53 compute-0 sudo[61518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:53 compute-0 python3.9[61520]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:04:53 compute-0 sudo[61518]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:54 compute-0 sudo[61602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyjsgcgkojwtlyddrzrfysqiizbbzwzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261492.838528-59-138886746094215/AnsiballZ_dnf.py'
Feb 16 17:04:54 compute-0 sudo[61602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:54 compute-0 python3.9[61604]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:04:55 compute-0 sudo[61602]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:55 compute-0 sudo[61756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnxhwsrvqgewemwfyunmcbrznwkxwydx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261495.7217612-83-17902460942276/AnsiballZ_setup.py'
Feb 16 17:04:55 compute-0 sudo[61756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:56 compute-0 python3.9[61758]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:04:56 compute-0 sudo[61756]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:57 compute-0 sudo[61947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xldmvkyikusarvtiqhjrkyqkpphbongo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261496.7917225-105-177311411727085/AnsiballZ_file.py'
Feb 16 17:04:57 compute-0 sudo[61947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:57 compute-0 python3.9[61949]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:57 compute-0 sudo[61947]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:58 compute-0 sudo[62099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfuljwgdftwbopunaazjkwgmrfbpvtua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261497.6503532-121-159607269834692/AnsiballZ_command.py'
Feb 16 17:04:58 compute-0 sudo[62099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:58 compute-0 python3.9[62101]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:04:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:04:58 compute-0 sudo[62099]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:59 compute-0 sudo[62262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awlzendxklwclcqbmehvwsupoyfzjoxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261498.6938539-137-73283957052450/AnsiballZ_stat.py'
Feb 16 17:04:59 compute-0 sudo[62262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:59 compute-0 python3.9[62264]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:04:59 compute-0 sudo[62262]: pam_unix(sudo:session): session closed for user root
Feb 16 17:04:59 compute-0 sudo[62340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrvdlnfephqrbkitzvlxieshxwcchvqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261498.6938539-137-73283957052450/AnsiballZ_file.py'
Feb 16 17:04:59 compute-0 sudo[62340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:04:59 compute-0 python3.9[62342]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:04:59 compute-0 sudo[62340]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:00 compute-0 sudo[62492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blojejhrohxqrieebcbrzhxkpoouwslq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261499.9941177-161-207211303087758/AnsiballZ_stat.py'
Feb 16 17:05:00 compute-0 sudo[62492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:00 compute-0 python3.9[62494]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:05:00 compute-0 sudo[62492]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:00 compute-0 sudo[62570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzolbrvdctxtaudavjqhltvkbnweulre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261499.9941177-161-207211303087758/AnsiballZ_file.py'
Feb 16 17:05:00 compute-0 sudo[62570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:00 compute-0 python3.9[62572]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:05:01 compute-0 sudo[62570]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:01 compute-0 sudo[62722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucldjnncddyvjfhvlmkcyijdciymopon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261501.2208195-187-107380080845300/AnsiballZ_ini_file.py'
Feb 16 17:05:01 compute-0 sudo[62722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:02 compute-0 python3.9[62724]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:05:02 compute-0 sudo[62722]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:02 compute-0 sudo[62874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhgshbiqisksmpaohcxeqjnehivegyez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261502.2772534-187-174781055595735/AnsiballZ_ini_file.py'
Feb 16 17:05:02 compute-0 sudo[62874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:02 compute-0 python3.9[62876]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:05:02 compute-0 sudo[62874]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:03 compute-0 sudo[63026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dochhpvyzifnreynksxavekljzmxggre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261502.9831815-187-203029196270448/AnsiballZ_ini_file.py'
Feb 16 17:05:03 compute-0 sudo[63026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:03 compute-0 python3.9[63028]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:05:03 compute-0 sudo[63026]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:03 compute-0 sudo[63178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reruwlzqfgwldpplluemxvkjujmngphi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261503.628101-187-24705416461184/AnsiballZ_ini_file.py'
Feb 16 17:05:03 compute-0 sudo[63178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:04 compute-0 python3.9[63180]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:05:04 compute-0 sudo[63178]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:04 compute-0 sudo[63330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqsboijjtzteyceudspmpyhzwrqswjsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261504.5033026-249-123649545012366/AnsiballZ_dnf.py'
Feb 16 17:05:04 compute-0 sudo[63330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:05 compute-0 python3.9[63332]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:05:06 compute-0 sudo[63330]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:07 compute-0 sudo[63483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sywncsoyfqhuzldbipstkjhrbditoula ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261506.7892494-271-104248068714835/AnsiballZ_setup.py'
Feb 16 17:05:07 compute-0 sudo[63483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:07 compute-0 python3.9[63485]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:05:07 compute-0 sudo[63483]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:08 compute-0 sudo[63637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiayialtphvkppffotkepgiijdgalzxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261508.1081622-287-206862334291653/AnsiballZ_stat.py'
Feb 16 17:05:08 compute-0 sudo[63637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:08 compute-0 python3.9[63639]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:05:08 compute-0 sudo[63637]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:09 compute-0 sudo[63789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuepgnrzwiflkqkbixihsikhobbewqka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261508.796625-305-77680641071838/AnsiballZ_stat.py'
Feb 16 17:05:09 compute-0 sudo[63789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:09 compute-0 python3.9[63791]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:05:09 compute-0 sudo[63789]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:10 compute-0 sudo[63941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfiunwldelmtvzdgpqjjkqcdebphqrnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261509.5892448-325-101011744074922/AnsiballZ_command.py'
Feb 16 17:05:10 compute-0 sudo[63941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:10 compute-0 python3.9[63943]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:05:10 compute-0 sudo[63941]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:11 compute-0 sudo[64094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwtsvwdobwvviwelkqjotrfudnndhxtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261510.6167278-345-211692630461938/AnsiballZ_service_facts.py'
Feb 16 17:05:11 compute-0 sudo[64094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:11 compute-0 python3.9[64096]: ansible-service_facts Invoked
Feb 16 17:05:11 compute-0 network[64113]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 17:05:11 compute-0 network[64114]: 'network-scripts' will be removed from distribution in near future.
Feb 16 17:05:11 compute-0 network[64115]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 17:05:14 compute-0 sudo[64094]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:16 compute-0 sudo[64399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiamlpwwmzimmibujfpgrvtucwjolihn ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1771261515.7661564-375-190557987193085/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1771261515.7661564-375-190557987193085/args'
Feb 16 17:05:16 compute-0 sudo[64399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:16 compute-0 sudo[64399]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:16 compute-0 sudo[64566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iembtlowljskjpobemlwcygsylthbtnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261516.524162-397-248608915016945/AnsiballZ_dnf.py'
Feb 16 17:05:16 compute-0 sudo[64566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:17 compute-0 python3.9[64568]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:05:18 compute-0 sudo[64566]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:19 compute-0 sudo[64719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-matutcldwzlgabucbwqjerkpybpdbwma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261518.781568-423-68858509429780/AnsiballZ_package_facts.py'
Feb 16 17:05:19 compute-0 sudo[64719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:19 compute-0 python3.9[64721]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 16 17:05:20 compute-0 sudo[64719]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:20 compute-0 sudo[64871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jknztspcdqywsuindobovvluhceexxyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261520.6457317-443-175629972333248/AnsiballZ_stat.py'
Feb 16 17:05:20 compute-0 sudo[64871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:21 compute-0 python3.9[64873]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:05:21 compute-0 sudo[64871]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:21 compute-0 sudo[64996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgjzdprvveapoxhzbtijmgjsqadxyspu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261520.6457317-443-175629972333248/AnsiballZ_copy.py'
Feb 16 17:05:21 compute-0 sudo[64996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:21 compute-0 python3.9[64998]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261520.6457317-443-175629972333248/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:05:21 compute-0 sudo[64996]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:22 compute-0 sudo[65150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnfiricujccxbgsvaallsqhamlxccfwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261522.2782059-473-258828351154824/AnsiballZ_stat.py'
Feb 16 17:05:22 compute-0 sudo[65150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:22 compute-0 python3.9[65152]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:05:22 compute-0 sudo[65150]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:23 compute-0 sudo[65275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avxqjaqwiqplulvoazymyrqvwhuilajw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261522.2782059-473-258828351154824/AnsiballZ_copy.py'
Feb 16 17:05:23 compute-0 sudo[65275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:23 compute-0 python3.9[65277]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261522.2782059-473-258828351154824/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:05:23 compute-0 sudo[65275]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:24 compute-0 sudo[65429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwgeaaolbfxrmwkysupqsgzjvhmfiyus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261524.1578727-515-108883147328234/AnsiballZ_lineinfile.py'
Feb 16 17:05:24 compute-0 sudo[65429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:24 compute-0 python3.9[65431]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:05:24 compute-0 sudo[65429]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:26 compute-0 sudo[65583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhnksimjzvcwnoswxircmtpppojncxll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261525.55582-545-106216428771508/AnsiballZ_setup.py'
Feb 16 17:05:26 compute-0 sudo[65583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:26 compute-0 python3.9[65585]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:05:26 compute-0 sudo[65583]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:27 compute-0 sudo[65667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avekztulgickfowwvkqqkyemxzjzzfwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261525.55582-545-106216428771508/AnsiballZ_systemd.py'
Feb 16 17:05:27 compute-0 sudo[65667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:27 compute-0 python3.9[65669]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:05:27 compute-0 sudo[65667]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:28 compute-0 sudo[65821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdmhfmihjqcppgxchyeiedfqkbcdbbmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261528.068323-577-128660574116628/AnsiballZ_setup.py'
Feb 16 17:05:28 compute-0 sudo[65821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:28 compute-0 python3.9[65823]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:05:28 compute-0 sudo[65821]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:29 compute-0 sudo[65905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qduhsypokwoltnzdpvdxgxsuyfmpbldg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261528.068323-577-128660574116628/AnsiballZ_systemd.py'
Feb 16 17:05:29 compute-0 sudo[65905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:29 compute-0 python3.9[65907]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:05:29 compute-0 chronyd[829]: chronyd exiting
Feb 16 17:05:29 compute-0 systemd[1]: Stopping NTP client/server...
Feb 16 17:05:29 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Feb 16 17:05:29 compute-0 systemd[1]: Stopped NTP client/server.
Feb 16 17:05:29 compute-0 systemd[1]: Starting NTP client/server...
Feb 16 17:05:29 compute-0 chronyd[65915]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 16 17:05:29 compute-0 chronyd[65915]: Frequency -24.545 +/- 0.115 ppm read from /var/lib/chrony/drift
Feb 16 17:05:29 compute-0 chronyd[65915]: Loaded seccomp filter (level 2)
Feb 16 17:05:29 compute-0 systemd[1]: Started NTP client/server.
Feb 16 17:05:29 compute-0 sudo[65905]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:30 compute-0 sshd-session[61059]: Connection closed by 192.168.122.30 port 53710
Feb 16 17:05:30 compute-0 sshd-session[61056]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:05:30 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Feb 16 17:05:30 compute-0 systemd[1]: session-14.scope: Consumed 24.276s CPU time.
Feb 16 17:05:30 compute-0 systemd-logind[821]: Session 14 logged out. Waiting for processes to exit.
Feb 16 17:05:30 compute-0 systemd-logind[821]: Removed session 14.
Feb 16 17:05:36 compute-0 sshd-session[65941]: Accepted publickey for zuul from 192.168.122.30 port 45926 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:05:36 compute-0 systemd-logind[821]: New session 15 of user zuul.
Feb 16 17:05:36 compute-0 systemd[1]: Started Session 15 of User zuul.
Feb 16 17:05:36 compute-0 sshd-session[65941]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:05:37 compute-0 python3.9[66094]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:05:38 compute-0 sudo[66248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvbcmtdwktinblmecoyvnvkusnpxkghl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261538.0233936-46-17505509990365/AnsiballZ_file.py'
Feb 16 17:05:38 compute-0 sudo[66248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:38 compute-0 python3.9[66250]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:05:38 compute-0 sudo[66248]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:39 compute-0 sudo[66423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrryffmpwwfzipnbbghhuvslsykvjenp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261538.945215-62-81360803599816/AnsiballZ_stat.py'
Feb 16 17:05:39 compute-0 sudo[66423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:39 compute-0 python3.9[66425]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:05:39 compute-0 sudo[66423]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:39 compute-0 sudo[66501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fakunhtgygibheecfnwfiaapidysscmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261538.945215-62-81360803599816/AnsiballZ_file.py'
Feb 16 17:05:39 compute-0 sudo[66501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:40 compute-0 python3.9[66503]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.80szqykd recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:05:40 compute-0 sudo[66501]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:40 compute-0 sudo[66653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqvzqyploqxacovwfceczxatgmsekpza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261540.6004796-102-151149327926228/AnsiballZ_stat.py'
Feb 16 17:05:40 compute-0 sudo[66653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:41 compute-0 python3.9[66655]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:05:41 compute-0 sudo[66653]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:41 compute-0 sudo[66776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwblwlvqpmlvvoircnypnvnmzbxyjcus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261540.6004796-102-151149327926228/AnsiballZ_copy.py'
Feb 16 17:05:41 compute-0 sudo[66776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:41 compute-0 python3.9[66778]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261540.6004796-102-151149327926228/.source _original_basename=.vt3g5sma follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:05:41 compute-0 sudo[66776]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:42 compute-0 sudo[66928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcchtjbpgysimdsjtitfriacnxcrmvmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261541.825139-134-173408997282884/AnsiballZ_file.py'
Feb 16 17:05:42 compute-0 sudo[66928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:42 compute-0 python3.9[66930]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:05:42 compute-0 sudo[66928]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:42 compute-0 sudo[67080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csewptvajpnboehnxjcismjqunyoylad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261542.7499604-150-26716507364826/AnsiballZ_stat.py'
Feb 16 17:05:43 compute-0 sudo[67080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:43 compute-0 python3.9[67082]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:05:43 compute-0 sudo[67080]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:43 compute-0 sudo[67203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcwhnilbwjsryboxnsnqtqqdwtvyxens ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261542.7499604-150-26716507364826/AnsiballZ_copy.py'
Feb 16 17:05:43 compute-0 sudo[67203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:43 compute-0 python3.9[67205]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261542.7499604-150-26716507364826/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:05:43 compute-0 sudo[67203]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:44 compute-0 sudo[67355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilfyztvkquweihutyzynusjkpmmkobrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261543.9911275-150-238378242854595/AnsiballZ_stat.py'
Feb 16 17:05:44 compute-0 sudo[67355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:44 compute-0 python3.9[67357]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:05:44 compute-0 sudo[67355]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:44 compute-0 sudo[67478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usbbtmmfetqwuijiiqqpdooxwpnqowic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261543.9911275-150-238378242854595/AnsiballZ_copy.py'
Feb 16 17:05:44 compute-0 sudo[67478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:45 compute-0 python3.9[67480]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261543.9911275-150-238378242854595/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:05:45 compute-0 sudo[67478]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:45 compute-0 sudo[67630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upkvhojilrqvipluhuggrpbnxjjhlhrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261545.1933646-208-281371516521998/AnsiballZ_file.py'
Feb 16 17:05:45 compute-0 sudo[67630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:45 compute-0 python3.9[67632]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:05:45 compute-0 sudo[67630]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:46 compute-0 sudo[67782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttjpkpwobrxzwphiliygbjjjkndssnin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261545.8073025-224-190738631061595/AnsiballZ_stat.py'
Feb 16 17:05:46 compute-0 sudo[67782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:46 compute-0 python3.9[67784]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:05:46 compute-0 sudo[67782]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:46 compute-0 sudo[67905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jajgtdtyoutohyhacvwwnenccdnwdktu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261545.8073025-224-190738631061595/AnsiballZ_copy.py'
Feb 16 17:05:46 compute-0 sudo[67905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:46 compute-0 python3.9[67907]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261545.8073025-224-190738631061595/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:05:46 compute-0 sudo[67905]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:47 compute-0 sudo[68058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-banattnjolthlxzloozwdnzdlyrwudrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261547.1348505-254-195061616586359/AnsiballZ_stat.py'
Feb 16 17:05:47 compute-0 sudo[68058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:47 compute-0 python3.9[68060]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:05:47 compute-0 sudo[68058]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:47 compute-0 sudo[68181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvwdkfaicqxjdjsrqvitklmlbgydijjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261547.1348505-254-195061616586359/AnsiballZ_copy.py'
Feb 16 17:05:47 compute-0 sudo[68181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:48 compute-0 python3.9[68183]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261547.1348505-254-195061616586359/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:05:48 compute-0 sudo[68181]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:48 compute-0 sudo[68333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itkizslisusxnjnuljlsqrywaadqgxzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261548.3605182-284-273603744002659/AnsiballZ_systemd.py'
Feb 16 17:05:48 compute-0 sudo[68333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:49 compute-0 python3.9[68335]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:05:49 compute-0 systemd[1]: Reloading.
Feb 16 17:05:49 compute-0 systemd-rc-local-generator[68355]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:05:49 compute-0 systemd-sysv-generator[68362]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:05:49 compute-0 systemd[1]: Reloading.
Feb 16 17:05:49 compute-0 systemd-sysv-generator[68406]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:05:49 compute-0 systemd-rc-local-generator[68403]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:05:49 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Feb 16 17:05:49 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Feb 16 17:05:49 compute-0 sudo[68333]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:50 compute-0 sudo[68574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sykpatyvhptbkjkofvhpywtbofcjpzku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261549.9944122-300-139634499063217/AnsiballZ_stat.py'
Feb 16 17:05:50 compute-0 sudo[68574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:50 compute-0 python3.9[68576]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:05:50 compute-0 sudo[68574]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:50 compute-0 sudo[68697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcvcwpmfshtqljphejqjwggsnbgsxszm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261549.9944122-300-139634499063217/AnsiballZ_copy.py'
Feb 16 17:05:50 compute-0 sudo[68697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:51 compute-0 python3.9[68699]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261549.9944122-300-139634499063217/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:05:51 compute-0 sudo[68697]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:51 compute-0 sudo[68849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scypvoybdvaczbmpllbuxiiwxtimbfyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261551.2535708-330-6638441460918/AnsiballZ_stat.py'
Feb 16 17:05:51 compute-0 sudo[68849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:51 compute-0 python3.9[68851]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:05:51 compute-0 sudo[68849]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:52 compute-0 sudo[68972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cecklmrizbdgnrbqumegewcwqsiieswg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261551.2535708-330-6638441460918/AnsiballZ_copy.py'
Feb 16 17:05:52 compute-0 sudo[68972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:52 compute-0 python3.9[68974]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261551.2535708-330-6638441460918/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:05:52 compute-0 sudo[68972]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:52 compute-0 sudo[69124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sscgbqulasxamnqkgenqcswweflsbszq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261552.4289672-360-185025796509314/AnsiballZ_systemd.py'
Feb 16 17:05:52 compute-0 sudo[69124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:52 compute-0 python3.9[69126]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:05:53 compute-0 systemd[1]: Reloading.
Feb 16 17:05:53 compute-0 systemd-sysv-generator[69157]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:05:53 compute-0 systemd-rc-local-generator[69153]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:05:53 compute-0 systemd[1]: Reloading.
Feb 16 17:05:53 compute-0 systemd-rc-local-generator[69198]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:05:53 compute-0 systemd-sysv-generator[69202]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:05:53 compute-0 systemd[1]: Starting Create netns directory...
Feb 16 17:05:53 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 16 17:05:53 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 16 17:05:53 compute-0 systemd[1]: Finished Create netns directory.
Feb 16 17:05:53 compute-0 sudo[69124]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:54 compute-0 python3.9[69366]: ansible-ansible.builtin.service_facts Invoked
Feb 16 17:05:54 compute-0 network[69383]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 17:05:54 compute-0 network[69384]: 'network-scripts' will be removed from distribution in near future.
Feb 16 17:05:54 compute-0 network[69385]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 17:05:56 compute-0 sudo[69646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phialqlztbtrinouoeyawdsfpjfjivno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261556.5915058-392-59613516051523/AnsiballZ_systemd.py'
Feb 16 17:05:56 compute-0 sudo[69646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:57 compute-0 python3.9[69648]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:05:57 compute-0 systemd[1]: Reloading.
Feb 16 17:05:57 compute-0 systemd-rc-local-generator[69674]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:05:57 compute-0 systemd-sysv-generator[69678]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:05:57 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Feb 16 17:05:57 compute-0 iptables.init[69696]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Feb 16 17:05:57 compute-0 iptables.init[69696]: iptables: Flushing firewall rules: [  OK  ]
Feb 16 17:05:57 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Feb 16 17:05:57 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Feb 16 17:05:57 compute-0 sudo[69646]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:58 compute-0 sudo[69890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvorokktxcikuifpekidyzezpppdfbec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261557.8614957-392-38543518577159/AnsiballZ_systemd.py'
Feb 16 17:05:58 compute-0 sudo[69890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:58 compute-0 python3.9[69892]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:05:58 compute-0 sudo[69890]: pam_unix(sudo:session): session closed for user root
Feb 16 17:05:58 compute-0 sudo[70044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvcvvcbbvnuwvcodreikhrpywrzvgznd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261558.6958349-424-10022584073530/AnsiballZ_systemd.py'
Feb 16 17:05:58 compute-0 sudo[70044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:05:59 compute-0 python3.9[70046]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:05:59 compute-0 systemd[1]: Reloading.
Feb 16 17:05:59 compute-0 systemd-rc-local-generator[70072]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:05:59 compute-0 systemd-sysv-generator[70077]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:05:59 compute-0 systemd[1]: Starting Netfilter Tables...
Feb 16 17:05:59 compute-0 systemd[1]: Finished Netfilter Tables.
Feb 16 17:05:59 compute-0 sudo[70044]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:00 compute-0 sudo[70243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyegbelxhalyfpscioxjqkfncowiolha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261559.779138-440-62662241745187/AnsiballZ_command.py'
Feb 16 17:06:00 compute-0 sudo[70243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:00 compute-0 python3.9[70245]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:06:00 compute-0 sudo[70243]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:01 compute-0 sudo[70396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sznxlvtoilfexiedtlnvecybdygkarfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261561.153571-468-152574418937460/AnsiballZ_stat.py'
Feb 16 17:06:01 compute-0 sudo[70396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:01 compute-0 python3.9[70398]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:06:01 compute-0 sudo[70396]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:02 compute-0 sudo[70521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puuwqzluavnzjpoykvpewlsgtbddaypm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261561.153571-468-152574418937460/AnsiballZ_copy.py'
Feb 16 17:06:02 compute-0 sudo[70521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:02 compute-0 python3.9[70523]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261561.153571-468-152574418937460/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:02 compute-0 sudo[70521]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:02 compute-0 sudo[70674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyhaktuirsexsfzgkmxmrvaujnfausxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261562.4760008-498-267419282897969/AnsiballZ_systemd.py'
Feb 16 17:06:02 compute-0 sudo[70674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:02 compute-0 python3.9[70676]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:06:03 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Feb 16 17:06:03 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Feb 16 17:06:03 compute-0 sshd[1021]: Received SIGHUP; restarting.
Feb 16 17:06:03 compute-0 sshd[1021]: Server listening on 0.0.0.0 port 22.
Feb 16 17:06:03 compute-0 sshd[1021]: Server listening on :: port 22.
Feb 16 17:06:03 compute-0 sudo[70674]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:03 compute-0 sudo[70830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prppugiizntljhtsukxbzsurxwvupaqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261563.294397-514-239558068219174/AnsiballZ_file.py'
Feb 16 17:06:03 compute-0 sudo[70830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:03 compute-0 python3.9[70832]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:03 compute-0 sudo[70830]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:04 compute-0 sudo[70982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrvihvftknriuknayippboxvftewfyoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261563.9647107-530-271751858496413/AnsiballZ_stat.py'
Feb 16 17:06:04 compute-0 sudo[70982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:04 compute-0 python3.9[70984]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:06:04 compute-0 sudo[70982]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:04 compute-0 sudo[71105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agwrphfwxukfjtdhudducfxoqapqbsle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261563.9647107-530-271751858496413/AnsiballZ_copy.py'
Feb 16 17:06:04 compute-0 sudo[71105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:05 compute-0 python3.9[71107]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261563.9647107-530-271751858496413/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:05 compute-0 sudo[71105]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:05 compute-0 sudo[71257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjgvwuuzdyktovkbvkdgbqiedurzkihm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261565.3879764-566-278254451289279/AnsiballZ_timezone.py'
Feb 16 17:06:05 compute-0 sudo[71257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:06 compute-0 python3.9[71259]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 16 17:06:06 compute-0 systemd[1]: Starting Time & Date Service...
Feb 16 17:06:06 compute-0 systemd[1]: Started Time & Date Service.
Feb 16 17:06:06 compute-0 sudo[71257]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:07 compute-0 sudo[71413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nocxtmrnkgcsyxjjmpyrpwpgwsobahbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261567.498578-584-162257414838639/AnsiballZ_file.py'
Feb 16 17:06:07 compute-0 sudo[71413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:07 compute-0 python3.9[71415]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:07 compute-0 sudo[71413]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:08 compute-0 sudo[71565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqomeeflpqcdhbnwlqzslgicbygsvgjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261568.468978-600-276819538025512/AnsiballZ_stat.py'
Feb 16 17:06:08 compute-0 sudo[71565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:08 compute-0 python3.9[71567]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:06:08 compute-0 sudo[71565]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:09 compute-0 sudo[71688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chtnwoulbukmvajuooqtrbtprkrjeixw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261568.468978-600-276819538025512/AnsiballZ_copy.py'
Feb 16 17:06:09 compute-0 sudo[71688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:09 compute-0 python3.9[71690]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261568.468978-600-276819538025512/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:09 compute-0 sudo[71688]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:09 compute-0 sudo[71840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtwmmojoxkvjyubkouqmsalslqobbkae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261569.6557934-630-198681472929666/AnsiballZ_stat.py'
Feb 16 17:06:09 compute-0 sudo[71840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:10 compute-0 python3.9[71842]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:06:10 compute-0 sudo[71840]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:10 compute-0 sudo[71963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyjbugzqutdgqmioknmsrwlbepdiarcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261569.6557934-630-198681472929666/AnsiballZ_copy.py'
Feb 16 17:06:10 compute-0 sudo[71963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:10 compute-0 python3.9[71965]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261569.6557934-630-198681472929666/.source.yaml _original_basename=.nub5pjae follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:10 compute-0 sudo[71963]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:11 compute-0 sudo[72115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imphleofyfawhdwnakbewiygzlhxfhvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261570.8240094-660-3511944234794/AnsiballZ_stat.py'
Feb 16 17:06:11 compute-0 sudo[72115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:11 compute-0 python3.9[72117]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:06:11 compute-0 sudo[72115]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:11 compute-0 sudo[72238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izdfcymfzxmuzjnaqbqxcrmmgiqfhjbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261570.8240094-660-3511944234794/AnsiballZ_copy.py'
Feb 16 17:06:11 compute-0 sudo[72238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:11 compute-0 python3.9[72240]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261570.8240094-660-3511944234794/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:11 compute-0 sudo[72238]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:12 compute-0 sudo[72390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uibseghucepepyzmyfekuthgjdfhkwfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261571.996027-690-164770982980848/AnsiballZ_command.py'
Feb 16 17:06:12 compute-0 sudo[72390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:12 compute-0 python3.9[72392]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:06:12 compute-0 sudo[72390]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:12 compute-0 sudo[72543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvcjfhiwdlrvlugqzjwucuasucmwylpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261572.6250656-706-112453227320995/AnsiballZ_command.py'
Feb 16 17:06:12 compute-0 sudo[72543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:13 compute-0 python3.9[72545]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:06:13 compute-0 sudo[72543]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:13 compute-0 sudo[72696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjyzmtcobjzirnstsshibtoxckkktnjl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771261573.2689908-722-34944093451367/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 17:06:13 compute-0 sudo[72696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:13 compute-0 python3[72698]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 17:06:13 compute-0 sudo[72696]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:14 compute-0 sudo[72848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvhdgmbcmhfchvttecakdfaodgglndit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261574.0813067-738-267905239225875/AnsiballZ_stat.py'
Feb 16 17:06:14 compute-0 sudo[72848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:14 compute-0 python3.9[72850]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:06:14 compute-0 sudo[72848]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:14 compute-0 sudo[72971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecwsekyszbeikstznyljypelzhudjwkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261574.0813067-738-267905239225875/AnsiballZ_copy.py'
Feb 16 17:06:14 compute-0 sudo[72971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:15 compute-0 python3.9[72973]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261574.0813067-738-267905239225875/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:15 compute-0 sudo[72971]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:15 compute-0 sudo[73123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frcphqgystgreoriluyqfzqaendxjxuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261575.221451-768-177436891858870/AnsiballZ_stat.py'
Feb 16 17:06:15 compute-0 sudo[73123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:15 compute-0 python3.9[73125]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:06:15 compute-0 sudo[73123]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:16 compute-0 sudo[73246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odqjwuqfanxrttjpfrymdwxmaviuyzev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261575.221451-768-177436891858870/AnsiballZ_copy.py'
Feb 16 17:06:16 compute-0 sudo[73246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:16 compute-0 python3.9[73248]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261575.221451-768-177436891858870/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:16 compute-0 sudo[73246]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:16 compute-0 sudo[73398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hctmabfxaybluivcsihuewsmrcqwrkkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261576.4635088-798-186086315507139/AnsiballZ_stat.py'
Feb 16 17:06:16 compute-0 sudo[73398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:16 compute-0 python3.9[73400]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:06:16 compute-0 sudo[73398]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:17 compute-0 sudo[73521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkpncsxujccbxjvrgsmmxugfjiglunmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261576.4635088-798-186086315507139/AnsiballZ_copy.py'
Feb 16 17:06:17 compute-0 sudo[73521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:17 compute-0 python3.9[73523]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261576.4635088-798-186086315507139/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:17 compute-0 sudo[73521]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:17 compute-0 sudo[73673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxkkytfnkbryhlctzpzascaozphnedsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261577.5868697-828-36472128996502/AnsiballZ_stat.py'
Feb 16 17:06:17 compute-0 sudo[73673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:18 compute-0 python3.9[73675]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:06:18 compute-0 sudo[73673]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:18 compute-0 sudo[73796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-entnadinunbxkvetfyficruqvpgmdqby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261577.5868697-828-36472128996502/AnsiballZ_copy.py'
Feb 16 17:06:18 compute-0 sudo[73796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:18 compute-0 python3.9[73798]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261577.5868697-828-36472128996502/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:18 compute-0 sudo[73796]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:19 compute-0 sudo[73948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icagfmvbrshgmsqoqbmitsnyngyvikly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261578.7332642-858-269210803240699/AnsiballZ_stat.py'
Feb 16 17:06:19 compute-0 sudo[73948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:19 compute-0 python3.9[73950]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:06:19 compute-0 sudo[73948]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:19 compute-0 sudo[74071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfbmlnwhowetghhujemtmfpauzyjceob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261578.7332642-858-269210803240699/AnsiballZ_copy.py'
Feb 16 17:06:19 compute-0 sudo[74071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:19 compute-0 python3.9[74073]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261578.7332642-858-269210803240699/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:19 compute-0 sudo[74071]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:20 compute-0 sudo[74223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysnkmaubosstxjutraoxmjdarvhorehh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261579.9411154-888-162249005729219/AnsiballZ_file.py'
Feb 16 17:06:20 compute-0 sudo[74223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:20 compute-0 python3.9[74225]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:20 compute-0 sudo[74223]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:20 compute-0 sudo[74375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isfkzlezdmckcxydcncrmknamgcqpopo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261580.5798306-904-138812662644949/AnsiballZ_command.py'
Feb 16 17:06:20 compute-0 sudo[74375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:21 compute-0 python3.9[74377]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:06:21 compute-0 sudo[74375]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:21 compute-0 sudo[74534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urnnquiofbrkvysrfgipgpmnxzejoldo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261581.2945614-920-242104537373898/AnsiballZ_blockinfile.py'
Feb 16 17:06:21 compute-0 sudo[74534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:21 compute-0 python3.9[74536]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:21 compute-0 sudo[74534]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:22 compute-0 sudo[74687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xumbfoqnmndicfghiovbnomctuixixqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261582.1801388-938-127367377920060/AnsiballZ_file.py'
Feb 16 17:06:22 compute-0 sudo[74687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:22 compute-0 python3.9[74689]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:22 compute-0 sudo[74687]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:23 compute-0 sudo[74839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfehuuziqgsrqwzbblxjtkesbuovvhxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261582.8082833-938-93794885752475/AnsiballZ_file.py'
Feb 16 17:06:23 compute-0 sudo[74839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:23 compute-0 python3.9[74841]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:23 compute-0 sudo[74839]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:23 compute-0 sudo[74991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsdpwglzjptdmyxwnpfrydhwxqybsrlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261583.4756248-968-97607573626567/AnsiballZ_mount.py'
Feb 16 17:06:23 compute-0 sudo[74991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:24 compute-0 python3.9[74993]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 16 17:06:24 compute-0 sudo[74991]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:24 compute-0 rsyslogd[1020]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 17:06:24 compute-0 sudo[75145]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmnckfkhmzepdrzpzefxskqtbnstqfll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261584.246964-968-49790798903874/AnsiballZ_mount.py'
Feb 16 17:06:24 compute-0 sudo[75145]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:24 compute-0 python3.9[75147]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 16 17:06:24 compute-0 sudo[75145]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:25 compute-0 sshd-session[65944]: Connection closed by 192.168.122.30 port 45926
Feb 16 17:06:25 compute-0 sshd-session[65941]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:06:25 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Feb 16 17:06:25 compute-0 systemd[1]: session-15.scope: Consumed 31.965s CPU time.
Feb 16 17:06:25 compute-0 systemd-logind[821]: Session 15 logged out. Waiting for processes to exit.
Feb 16 17:06:25 compute-0 systemd-logind[821]: Removed session 15.
Feb 16 17:06:30 compute-0 sshd-session[75173]: Accepted publickey for zuul from 192.168.122.30 port 38774 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:06:30 compute-0 systemd-logind[821]: New session 16 of user zuul.
Feb 16 17:06:30 compute-0 systemd[1]: Started Session 16 of User zuul.
Feb 16 17:06:30 compute-0 sshd-session[75173]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:06:31 compute-0 sudo[75326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wggmagxftbmbsmtoqazfcyrlhufxuxah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261591.038801-22-74038298360912/AnsiballZ_tempfile.py'
Feb 16 17:06:31 compute-0 sudo[75326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:31 compute-0 python3.9[75328]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 16 17:06:31 compute-0 sudo[75326]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:32 compute-0 sudo[75478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbdxerzbcaatzhncospqgmpwbbxelxaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261591.8046865-46-162182266089039/AnsiballZ_stat.py'
Feb 16 17:06:32 compute-0 sudo[75478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:32 compute-0 python3.9[75480]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:06:32 compute-0 sudo[75478]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:33 compute-0 sudo[75630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toeeissdxhobillllnhwrrxtbofuhump ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261592.7748837-66-123562555756810/AnsiballZ_setup.py'
Feb 16 17:06:33 compute-0 sudo[75630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:33 compute-0 python3.9[75632]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:06:33 compute-0 sudo[75630]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:34 compute-0 sudo[75782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzzrwiuqlpoelqmsxzalljrwuxzuvgty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261593.919168-83-165333692633880/AnsiballZ_blockinfile.py'
Feb 16 17:06:34 compute-0 sudo[75782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:34 compute-0 python3.9[75784]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMl+7tT4s9vPnaBwSvQ7mRzYdijJ68eYLib4cPwTaFrQkzb42hC75WCfB+54TF7d1YP574J4/5iYEmapTJEwzFcrMJoppOb8J8RHAjO0KxMIGyTC/pbEx9ZwoYjeyWo5NDVhOr3IDtdpnamMsBerrR3ijBCBeP5Q6Xfk+no2oSO8/fPXgwWMK3JCoRooPq/KiNc+w9ASd44yCg7cKlvjrtuKNfb7sUT5e3zBvbrAQ9TP1KhDAZ/yjeUftjFQ8gi8aUU0q+btThHpRKXZ5DL7Rr752Kie90/5BT/1wre2zOBtGo39fftijmjn0LblwvxTqfvmiBOF9pG5usrXLvNqkALRVPYpIdlz7hS4jA8KqS52SlheYVDSf2tpibPiRXrc3Hcwr7tcBtAK0Z9y9QwkJKMuNYIMp0lYrggzGjFrd3oneXPmB215CTEVpOhvQ+m6YlpSjqrtWWzyxAn9dZFyWdyaoR9XPT3Rgl+L9PAKIoLy02Nq/+NJRlbuTtTQHbyrk=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPasWK5hVe1/x6Xi1bhzH/2SVtabnZeHaZnMnVHMBrws
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOHFtrXgNzJjOtcsLnMvIPbqQFNUuizSS9dGzHDC/qZ7rskmYKONjDLvvdxohPkO9EgXsAPKXvO02pZ10/o7gVU=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDtaRE0JFzLc0raeom2YUFdVyVneVWWaZH5nPzciX3IDJRsxFUYz5AcuBTiPjO6kvjek84upQ5qOk5SU3UO6qdeQo6mwIOJ4tqncDfq2Kq2De9mhDZMS+VEoCHLuv4/aXNmQS/DhKA6S2hI2DDBw9/JgoNWTV2a4CqpF7iDngroiOkXX17rHNKY9S+V/i9GPI2xrF7T85+A+6Vcv75tGxRUE3oO/Lrw4dO6NDU0bwjvx4XQ9JouR7e5oGSWOkO5cs28u29kDeWRg/h++EYS4bQv5kIBsLupbhuaSxn3TMrZkKHzcWBEzBzdQNx7JhrDw3Hcx2w320SP3s8dln9xfQwGtpV9W5XW7R8ATNaRQAeOb1/+64wFUosaakkoK1qY0vR4Q+KY+houfem/Cu5AX6w/5/ezRBc7IDlr/BGVwFHWN3zqvgl0FJ419XLakq4GCzQd3XvIkESFiKLv84hKRUi79ZkDkHyRnaFLfFV/+J9lGzBN8LPK85nhPqmgbsSxbcE=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILDT/gkEXgTYSbRJAIcciY5MR+QPxiNpNThx0iJCQ91t
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHCgzbUvRFyO58i+sLgpRvyb4rknAo0LfnATnyKXGsnn6WWEeeSrx1ig7NsjXvbQF8BeTC8fJds/ibq0Fqz4JZQ=
                                             create=True mode=0644 path=/tmp/ansible.24z7dvdf state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:34 compute-0 sudo[75782]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:35 compute-0 sudo[75934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ervchrnpgbdbxpvwmrxzixgiiofimesr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261594.6572623-99-259530810838671/AnsiballZ_command.py'
Feb 16 17:06:35 compute-0 sudo[75934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:35 compute-0 python3.9[75936]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.24z7dvdf' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:06:35 compute-0 sudo[75934]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:35 compute-0 sudo[76088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqiobriioazxgwiedebpepbpelnnvgkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261595.4445202-115-179147629736598/AnsiballZ_file.py'
Feb 16 17:06:35 compute-0 sudo[76088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:36 compute-0 python3.9[76090]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.24z7dvdf state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:36 compute-0 sudo[76088]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:36 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 16 17:06:36 compute-0 sshd-session[75176]: Connection closed by 192.168.122.30 port 38774
Feb 16 17:06:36 compute-0 sshd-session[75173]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:06:36 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Feb 16 17:06:36 compute-0 systemd[1]: session-16.scope: Consumed 2.985s CPU time.
Feb 16 17:06:36 compute-0 systemd-logind[821]: Session 16 logged out. Waiting for processes to exit.
Feb 16 17:06:36 compute-0 systemd-logind[821]: Removed session 16.
Feb 16 17:06:42 compute-0 sshd-session[76117]: Accepted publickey for zuul from 192.168.122.30 port 39414 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:06:42 compute-0 systemd-logind[821]: New session 17 of user zuul.
Feb 16 17:06:42 compute-0 systemd[1]: Started Session 17 of User zuul.
Feb 16 17:06:42 compute-0 sshd-session[76117]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:06:43 compute-0 python3.9[76270]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:06:43 compute-0 sudo[76424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwdbwcydjjdwlncadgudxjpbivmuzfng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261603.4069273-45-144867787321951/AnsiballZ_systemd.py'
Feb 16 17:06:43 compute-0 sudo[76424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:44 compute-0 python3.9[76426]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 16 17:06:44 compute-0 sudo[76424]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:44 compute-0 sudo[76578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avzqwrmspaouhwysovybuocvzttfhbfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261604.4668262-61-167563420834691/AnsiballZ_systemd.py'
Feb 16 17:06:44 compute-0 sudo[76578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:45 compute-0 python3.9[76580]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:06:45 compute-0 sudo[76578]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:45 compute-0 sudo[76731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvlinxersbruokvnrplwtxuxnsyjufrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261605.2726152-79-158860666972682/AnsiballZ_command.py'
Feb 16 17:06:45 compute-0 sudo[76731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:45 compute-0 python3.9[76733]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:06:45 compute-0 sudo[76731]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:46 compute-0 sudo[76884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpavsurcjhpnkqesfhxrytrwsqrixsnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261606.1314993-95-252353591030891/AnsiballZ_stat.py'
Feb 16 17:06:46 compute-0 sudo[76884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:46 compute-0 python3.9[76886]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:06:46 compute-0 sudo[76884]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:47 compute-0 sudo[77038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zefrpknvypehmpqyljywbncfvfmsvhia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261606.9501767-111-25829698874002/AnsiballZ_command.py'
Feb 16 17:06:47 compute-0 sudo[77038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:47 compute-0 python3.9[77040]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:06:47 compute-0 sudo[77038]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:47 compute-0 sudo[77193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecfclvyuxbvrrtrmunouxfvarjrrgdjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261607.6167767-127-207060683842840/AnsiballZ_file.py'
Feb 16 17:06:47 compute-0 sudo[77193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:48 compute-0 python3.9[77195]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:06:48 compute-0 sudo[77193]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:48 compute-0 sshd-session[76120]: Connection closed by 192.168.122.30 port 39414
Feb 16 17:06:48 compute-0 sshd-session[76117]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:06:48 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Feb 16 17:06:48 compute-0 systemd[1]: session-17.scope: Consumed 3.988s CPU time.
Feb 16 17:06:48 compute-0 systemd-logind[821]: Session 17 logged out. Waiting for processes to exit.
Feb 16 17:06:48 compute-0 systemd-logind[821]: Removed session 17.
Feb 16 17:06:54 compute-0 sshd-session[77221]: Accepted publickey for zuul from 192.168.122.30 port 50646 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:06:54 compute-0 systemd-logind[821]: New session 18 of user zuul.
Feb 16 17:06:54 compute-0 systemd[1]: Started Session 18 of User zuul.
Feb 16 17:06:54 compute-0 sshd-session[77221]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:06:55 compute-0 python3.9[77374]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:06:55 compute-0 sudo[77528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhmyvtpylkpdgyeertlczjmudpdmrkcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261615.4086146-48-53972997424386/AnsiballZ_setup.py'
Feb 16 17:06:55 compute-0 sudo[77528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:55 compute-0 python3.9[77530]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:06:56 compute-0 sudo[77528]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:56 compute-0 sudo[77612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkkhqmnjwzaaoiatyxntbdzaruzcrvfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261615.4086146-48-53972997424386/AnsiballZ_dnf.py'
Feb 16 17:06:56 compute-0 sudo[77612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:06:56 compute-0 python3.9[77614]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 16 17:06:58 compute-0 sudo[77612]: pam_unix(sudo:session): session closed for user root
Feb 16 17:06:58 compute-0 python3.9[77765]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:06:59 compute-0 python3.9[77916]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 17:07:00 compute-0 python3.9[78066]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:07:01 compute-0 python3.9[78216]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:07:01 compute-0 sshd-session[77224]: Connection closed by 192.168.122.30 port 50646
Feb 16 17:07:01 compute-0 sshd-session[77221]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:07:01 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Feb 16 17:07:01 compute-0 systemd[1]: session-18.scope: Consumed 5.309s CPU time.
Feb 16 17:07:01 compute-0 systemd-logind[821]: Session 18 logged out. Waiting for processes to exit.
Feb 16 17:07:01 compute-0 systemd-logind[821]: Removed session 18.
Feb 16 17:07:07 compute-0 sshd-session[78241]: Accepted publickey for zuul from 192.168.122.30 port 55192 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:07:07 compute-0 systemd-logind[821]: New session 19 of user zuul.
Feb 16 17:07:07 compute-0 systemd[1]: Started Session 19 of User zuul.
Feb 16 17:07:07 compute-0 sshd-session[78241]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:07:08 compute-0 python3.9[78394]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:07:09 compute-0 sudo[78548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqianeklfukrspumwwyqnxpugcjrfraf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261629.1533148-80-250309087787300/AnsiballZ_file.py'
Feb 16 17:07:09 compute-0 sudo[78548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:09 compute-0 python3.9[78550]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:09 compute-0 sudo[78548]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:10 compute-0 sudo[78700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwxyatennpvxblwdkrngnkurtccczdks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261629.8175404-80-116404396053578/AnsiballZ_file.py'
Feb 16 17:07:10 compute-0 sudo[78700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:10 compute-0 python3.9[78702]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:10 compute-0 sudo[78700]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:10 compute-0 sudo[78852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqfwrnifuegxqelhhgjbwkplcvyecpew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261630.461562-112-174298366309123/AnsiballZ_stat.py'
Feb 16 17:07:10 compute-0 sudo[78852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:10 compute-0 python3.9[78854]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:10 compute-0 sudo[78852]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:11 compute-0 sudo[78975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpeheqwzeniihdbhbukpizceiwphjygf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261630.461562-112-174298366309123/AnsiballZ_copy.py'
Feb 16 17:07:11 compute-0 sudo[78975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:11 compute-0 python3.9[78977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261630.461562-112-174298366309123/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=eb7d596ada28e92303bcd0b974abd85816a46d6e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:11 compute-0 sudo[78975]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:11 compute-0 sudo[79127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwqcqhbhevejgfraahvwrmifrhrymnwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261631.636052-112-140810022191508/AnsiballZ_stat.py'
Feb 16 17:07:11 compute-0 sudo[79127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:12 compute-0 python3.9[79129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:12 compute-0 sudo[79127]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:12 compute-0 sudo[79250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkxxjryxmojyxhaqbjimyspictepjyrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261631.636052-112-140810022191508/AnsiballZ_copy.py'
Feb 16 17:07:12 compute-0 sudo[79250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:12 compute-0 python3.9[79252]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261631.636052-112-140810022191508/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f68af6a35241af08b05612d074d3aaf2a71ae633 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:12 compute-0 sudo[79250]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:12 compute-0 sudo[79402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdgpmrymoguiopfpvhrimixspgdecvbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261632.6188557-112-183009893772987/AnsiballZ_stat.py'
Feb 16 17:07:12 compute-0 sudo[79402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:13 compute-0 python3.9[79404]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:13 compute-0 sudo[79402]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:13 compute-0 sudo[79525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfqpbtkknvnqzprenemydiamjipkkocb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261632.6188557-112-183009893772987/AnsiballZ_copy.py'
Feb 16 17:07:13 compute-0 sudo[79525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:13 compute-0 python3.9[79527]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261632.6188557-112-183009893772987/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=b98b24493a1287867c8f12b7afc72d250ab511a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:13 compute-0 sudo[79525]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:14 compute-0 sudo[79677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckhjebumbbyhjbsohdtmpbdafvbzbnxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261633.962934-201-129039993656399/AnsiballZ_file.py'
Feb 16 17:07:14 compute-0 sudo[79677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:14 compute-0 python3.9[79679]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:14 compute-0 sudo[79677]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:14 compute-0 sudo[79829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oabgpzjyxmhbgccgawytptdixqfwnkpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261634.5403004-201-229602429089488/AnsiballZ_file.py'
Feb 16 17:07:14 compute-0 sudo[79829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:14 compute-0 python3.9[79831]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:14 compute-0 sudo[79829]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:15 compute-0 sudo[79981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbmcawgqlcgonwjsqljpfjkqmcchxrdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261635.151164-232-272762555089229/AnsiballZ_stat.py'
Feb 16 17:07:15 compute-0 sudo[79981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:15 compute-0 python3.9[79983]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:15 compute-0 sudo[79981]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:16 compute-0 sudo[80104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igispqgaoeaxoigzrsvaoomazafqxxhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261635.151164-232-272762555089229/AnsiballZ_copy.py'
Feb 16 17:07:16 compute-0 sudo[80104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:16 compute-0 python3.9[80106]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261635.151164-232-272762555089229/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=d089308b2a82f698119515e8a5610a78fd881742 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:16 compute-0 sudo[80104]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:16 compute-0 sudo[80256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbrougtanqkbufrybdvemfwvbshmsstj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261636.389957-232-66819544656069/AnsiballZ_stat.py'
Feb 16 17:07:16 compute-0 sudo[80256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:16 compute-0 python3.9[80258]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:16 compute-0 sudo[80256]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:17 compute-0 sudo[80379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wboufcuwlrzwzfcldzhspmicclsgwihk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261636.389957-232-66819544656069/AnsiballZ_copy.py'
Feb 16 17:07:17 compute-0 sudo[80379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:17 compute-0 python3.9[80381]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261636.389957-232-66819544656069/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=33655fe41c76dd5f53aad60a399996797e022888 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:17 compute-0 sudo[80379]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:17 compute-0 sudo[80531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itsvovipdbrsxfnfvbhcwmopxjrcufqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261637.564068-232-191977972316749/AnsiballZ_stat.py'
Feb 16 17:07:17 compute-0 sudo[80531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:18 compute-0 python3.9[80533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:18 compute-0 sudo[80531]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:18 compute-0 sudo[80654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olpibekisqljmwiwmypsxbubpekvzkgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261637.564068-232-191977972316749/AnsiballZ_copy.py'
Feb 16 17:07:18 compute-0 sudo[80654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:18 compute-0 python3.9[80656]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261637.564068-232-191977972316749/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=2c0d62bd9e2378cd2790328d6dc09fe05031d410 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:18 compute-0 sudo[80654]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:19 compute-0 sudo[80806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xihhlfbsthhrtqndfxxcezzgqiodyhbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261639.3228946-323-149417323972434/AnsiballZ_file.py'
Feb 16 17:07:19 compute-0 sudo[80806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:19 compute-0 python3.9[80808]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:19 compute-0 sudo[80806]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:20 compute-0 sudo[80958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynchymyirjukbjgvlfklkknxsmcasnef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261640.1741192-323-27442586962248/AnsiballZ_file.py'
Feb 16 17:07:20 compute-0 sudo[80958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:20 compute-0 python3.9[80960]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:20 compute-0 sudo[80958]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:21 compute-0 sudo[81110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgdtqmyppidrqlwabvuffcstoukqceju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261641.0754404-352-256557448022808/AnsiballZ_stat.py'
Feb 16 17:07:21 compute-0 sudo[81110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:21 compute-0 python3.9[81112]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:21 compute-0 sudo[81110]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:21 compute-0 sudo[81233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxrrvhkzmkwauzxyzjljshdyefymsned ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261641.0754404-352-256557448022808/AnsiballZ_copy.py'
Feb 16 17:07:21 compute-0 sudo[81233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:22 compute-0 python3.9[81235]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261641.0754404-352-256557448022808/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=baf479163800064ca7736b33022d26ae76e9b3bc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:22 compute-0 sudo[81233]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:22 compute-0 sudo[81385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrzmdiruyynuyecahyizsrailvupshee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261642.3747501-352-164677024469561/AnsiballZ_stat.py'
Feb 16 17:07:22 compute-0 sudo[81385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:22 compute-0 python3.9[81387]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:22 compute-0 sudo[81385]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:23 compute-0 sudo[81508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfzeheaiowlxcvgcbynrjzgioiicqtgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261642.3747501-352-164677024469561/AnsiballZ_copy.py'
Feb 16 17:07:23 compute-0 sudo[81508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:23 compute-0 python3.9[81510]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261642.3747501-352-164677024469561/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=e82acd5edfa4bd72f1792e037e9a15625240e6cd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:23 compute-0 sudo[81508]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:23 compute-0 sudo[81660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwsjpgtzgtdmsrycbqfkeogrypsoseto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261643.4307833-352-132935178918559/AnsiballZ_stat.py'
Feb 16 17:07:23 compute-0 sudo[81660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:23 compute-0 python3.9[81662]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:23 compute-0 sudo[81660]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:24 compute-0 sudo[81783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrchdcbsddtuitgbizaqvvlnrxqzmebb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261643.4307833-352-132935178918559/AnsiballZ_copy.py'
Feb 16 17:07:24 compute-0 sudo[81783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:24 compute-0 python3.9[81785]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261643.4307833-352-132935178918559/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=42ab21595f1e298bed03e8ba4d170e58ca02e7ec backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:24 compute-0 sudo[81783]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:24 compute-0 sudo[81935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pywokmjspmxlmkhgshnevendvldtzsro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261644.5871115-442-261943053627198/AnsiballZ_file.py'
Feb 16 17:07:24 compute-0 sudo[81935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:25 compute-0 python3.9[81937]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:25 compute-0 sudo[81935]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:25 compute-0 sudo[82087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcyadvwlhqpmgpqjyxncdsbqbjytwzuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261645.139481-442-213770507024387/AnsiballZ_file.py'
Feb 16 17:07:25 compute-0 sudo[82087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:25 compute-0 python3.9[82089]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:25 compute-0 sudo[82087]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:25 compute-0 sudo[82239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hotbwsbnsasajupmpemfcdrvindyhwqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261645.7679992-477-127780729579233/AnsiballZ_stat.py'
Feb 16 17:07:25 compute-0 sudo[82239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:26 compute-0 python3.9[82241]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:26 compute-0 sudo[82239]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:26 compute-0 sudo[82362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvbpocouiybdorqnpxlwohckjocjzgwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261645.7679992-477-127780729579233/AnsiballZ_copy.py'
Feb 16 17:07:26 compute-0 sudo[82362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:26 compute-0 python3.9[82364]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261645.7679992-477-127780729579233/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=6d2867d7888017d588a7c6b1edbfe6ba38a2a973 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:26 compute-0 sudo[82362]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:27 compute-0 sudo[82514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dihjsnapwxeawvdiwsylerftlfutzwlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261646.8525524-477-248280850007071/AnsiballZ_stat.py'
Feb 16 17:07:27 compute-0 sudo[82514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:27 compute-0 python3.9[82516]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:27 compute-0 sudo[82514]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:27 compute-0 sudo[82637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdwcuxnnpbcnvnbnizaatrhssnkuuxkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261646.8525524-477-248280850007071/AnsiballZ_copy.py'
Feb 16 17:07:27 compute-0 sudo[82637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:27 compute-0 python3.9[82639]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261646.8525524-477-248280850007071/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=e82acd5edfa4bd72f1792e037e9a15625240e6cd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:27 compute-0 sudo[82637]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:28 compute-0 sudo[82789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxuqiqsajgxzfsrcaptgufgxltvrhuir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261647.9694464-477-225159856548972/AnsiballZ_stat.py'
Feb 16 17:07:28 compute-0 sudo[82789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:28 compute-0 python3.9[82791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:28 compute-0 sudo[82789]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:28 compute-0 sudo[82912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqdmeyhaqwtzapbcelbfxlgebgohiiwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261647.9694464-477-225159856548972/AnsiballZ_copy.py'
Feb 16 17:07:28 compute-0 sudo[82912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:28 compute-0 python3.9[82914]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261647.9694464-477-225159856548972/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=db0ac72a887e20fd897a246114746c252b727ef7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:28 compute-0 sudo[82912]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:30 compute-0 sudo[83064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmcrxckqxmlbguwyxgckdtnxkpgjhsjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261649.777197-598-270217781150315/AnsiballZ_file.py'
Feb 16 17:07:30 compute-0 sudo[83064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:30 compute-0 python3.9[83066]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:30 compute-0 sudo[83064]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:30 compute-0 sudo[83216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yewdrzdgkakenvfqyondfvhbjykxcnzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261650.3723423-615-28842705786050/AnsiballZ_stat.py'
Feb 16 17:07:30 compute-0 sudo[83216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:30 compute-0 python3.9[83218]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:30 compute-0 sudo[83216]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:31 compute-0 sudo[83339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlmtadctgsakhxeewsugqnccomjkwkgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261650.3723423-615-28842705786050/AnsiballZ_copy.py'
Feb 16 17:07:31 compute-0 sudo[83339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:31 compute-0 python3.9[83341]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261650.3723423-615-28842705786050/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2666b00e89898ecfd58a5d594369d5356783239e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:31 compute-0 sudo[83339]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:31 compute-0 sudo[83491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agreqyniccljgjzbnhctbzwuqsgfzwfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261651.5467052-646-227271809887255/AnsiballZ_file.py'
Feb 16 17:07:31 compute-0 sudo[83491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:32 compute-0 python3.9[83493]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:32 compute-0 sudo[83491]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:32 compute-0 sudo[83643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijygqzgwswbxxrlkachassntsknzpyab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261652.2412007-663-253246641456378/AnsiballZ_stat.py'
Feb 16 17:07:32 compute-0 sudo[83643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:32 compute-0 python3.9[83645]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:32 compute-0 sudo[83643]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:33 compute-0 sudo[83766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfdvddoulznpyaggvlnqjazcjphakwgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261652.2412007-663-253246641456378/AnsiballZ_copy.py'
Feb 16 17:07:33 compute-0 sudo[83766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:33 compute-0 python3.9[83768]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261652.2412007-663-253246641456378/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2666b00e89898ecfd58a5d594369d5356783239e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:33 compute-0 sudo[83766]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:33 compute-0 sudo[83918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jigxdosgvwukkvbfwdcukocqjrbffpns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261653.3789465-696-207764761643729/AnsiballZ_file.py'
Feb 16 17:07:33 compute-0 sudo[83918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:33 compute-0 python3.9[83920]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:33 compute-0 sudo[83918]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:34 compute-0 sudo[84070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlztjktirtlfiebgobnclvbgxrokbtuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261654.0554295-716-150733810977170/AnsiballZ_stat.py'
Feb 16 17:07:34 compute-0 sudo[84070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:34 compute-0 python3.9[84072]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:34 compute-0 sudo[84070]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:34 compute-0 sudo[84193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppyrgvjrcwxzxanqlkycuhudasfbkmcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261654.0554295-716-150733810977170/AnsiballZ_copy.py'
Feb 16 17:07:34 compute-0 sudo[84193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:35 compute-0 python3.9[84195]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261654.0554295-716-150733810977170/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2666b00e89898ecfd58a5d594369d5356783239e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:35 compute-0 sudo[84193]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:35 compute-0 sudo[84345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbmgrvratwskboonbeeeypxtaradoiri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261655.2066977-749-54955301344739/AnsiballZ_file.py'
Feb 16 17:07:35 compute-0 sudo[84345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:35 compute-0 python3.9[84347]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:35 compute-0 sudo[84345]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:36 compute-0 sudo[84497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yownsdrpwunenoiwlniiyhqnkinrwhxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261655.7979758-766-177852062636795/AnsiballZ_stat.py'
Feb 16 17:07:36 compute-0 sudo[84497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:36 compute-0 python3.9[84499]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:36 compute-0 sudo[84497]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:36 compute-0 sudo[84620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybqjsyhnxjkmqpuxtrrljvawjqyqsnpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261655.7979758-766-177852062636795/AnsiballZ_copy.py'
Feb 16 17:07:36 compute-0 sudo[84620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:36 compute-0 python3.9[84622]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261655.7979758-766-177852062636795/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2666b00e89898ecfd58a5d594369d5356783239e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:36 compute-0 sudo[84620]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:37 compute-0 sudo[84772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uggacougwkqtuwpaguujnlnxqydecqkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261656.9343584-797-29883513931425/AnsiballZ_file.py'
Feb 16 17:07:37 compute-0 sudo[84772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:37 compute-0 python3.9[84774]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:37 compute-0 sudo[84772]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:37 compute-0 sudo[84924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejuzcqfswdrdqjftbalhydqwkenrajwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261657.533803-814-224816412942797/AnsiballZ_stat.py'
Feb 16 17:07:37 compute-0 sudo[84924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:37 compute-0 python3.9[84926]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:37 compute-0 sudo[84924]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:38 compute-0 sudo[85047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rokcghojubbxjbrhfiokcharcvjtvmsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261657.533803-814-224816412942797/AnsiballZ_copy.py'
Feb 16 17:07:38 compute-0 sudo[85047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:38 compute-0 python3.9[85049]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261657.533803-814-224816412942797/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2666b00e89898ecfd58a5d594369d5356783239e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:38 compute-0 sudo[85047]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:38 compute-0 sudo[85199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzcdslljschizoufnumrfoyphhekhaqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261658.584768-846-184450081876013/AnsiballZ_file.py'
Feb 16 17:07:38 compute-0 sudo[85199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:39 compute-0 python3.9[85201]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:39 compute-0 sudo[85199]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:39 compute-0 chronyd[65915]: Selected source 23.159.16.194 (pool.ntp.org)
Feb 16 17:07:39 compute-0 sudo[85351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mevecdsnudzlvljxyvngjbgwtdtnetkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261659.1752057-863-66805862248612/AnsiballZ_stat.py'
Feb 16 17:07:39 compute-0 sudo[85351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:39 compute-0 python3.9[85353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:39 compute-0 sudo[85351]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:39 compute-0 sudo[85474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viezylcavpkysbgdhcszcgwaqpgavubd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261659.1752057-863-66805862248612/AnsiballZ_copy.py'
Feb 16 17:07:39 compute-0 sudo[85474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:40 compute-0 python3.9[85476]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261659.1752057-863-66805862248612/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2666b00e89898ecfd58a5d594369d5356783239e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:40 compute-0 sudo[85474]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:40 compute-0 sudo[85626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrqsmucbcasgcuewimogkpcuddonslmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261660.341164-894-109901908134385/AnsiballZ_file.py'
Feb 16 17:07:40 compute-0 sudo[85626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:40 compute-0 python3.9[85628]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:40 compute-0 sudo[85626]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:41 compute-0 sudo[85778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcxvccxgmpvtgqcskefpyctatyvpshcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261660.9306054-902-7478366897061/AnsiballZ_stat.py'
Feb 16 17:07:41 compute-0 sudo[85778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:41 compute-0 python3.9[85780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:41 compute-0 sudo[85778]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:41 compute-0 sudo[85901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-merxsrzlpjursbdnjxdmlrljvqiekrnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261660.9306054-902-7478366897061/AnsiballZ_copy.py'
Feb 16 17:07:41 compute-0 sudo[85901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:41 compute-0 python3.9[85903]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261660.9306054-902-7478366897061/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=2666b00e89898ecfd58a5d594369d5356783239e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:41 compute-0 sudo[85901]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:42 compute-0 sshd-session[78244]: Connection closed by 192.168.122.30 port 55192
Feb 16 17:07:42 compute-0 sshd-session[78241]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:07:42 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Feb 16 17:07:42 compute-0 systemd[1]: session-19.scope: Consumed 26.105s CPU time.
Feb 16 17:07:42 compute-0 systemd-logind[821]: Session 19 logged out. Waiting for processes to exit.
Feb 16 17:07:42 compute-0 systemd-logind[821]: Removed session 19.
Feb 16 17:07:47 compute-0 sshd-session[85928]: Accepted publickey for zuul from 192.168.122.30 port 38586 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:07:47 compute-0 systemd-logind[821]: New session 20 of user zuul.
Feb 16 17:07:47 compute-0 systemd[1]: Started Session 20 of User zuul.
Feb 16 17:07:47 compute-0 sshd-session[85928]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:07:48 compute-0 python3.9[86081]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:07:49 compute-0 sudo[86235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtgbsuzensjmfdpkrmiojurrnxoyfnky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261669.37978-48-242056471601787/AnsiballZ_file.py'
Feb 16 17:07:49 compute-0 sudo[86235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:50 compute-0 python3.9[86237]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:50 compute-0 sudo[86235]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:50 compute-0 sudo[86387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fafbnqcunoortopjmgabhhbbqlxevdon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261670.1509905-48-66585818768663/AnsiballZ_file.py'
Feb 16 17:07:50 compute-0 sudo[86387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:50 compute-0 python3.9[86389]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:07:50 compute-0 sudo[86387]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:51 compute-0 python3.9[86539]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:07:51 compute-0 sudo[86689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rviyjoibugkybffjmnwrttdsrjlzozex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261671.4455273-94-179401897387530/AnsiballZ_seboolean.py'
Feb 16 17:07:51 compute-0 sudo[86689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:52 compute-0 python3.9[86691]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 16 17:07:53 compute-0 sudo[86689]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:53 compute-0 sudo[86845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dydxxzicdroikmqmvtijsywjxygalera ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261673.2973225-114-280378964122010/AnsiballZ_setup.py'
Feb 16 17:07:53 compute-0 dbus-broker-launch[807]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 16 17:07:53 compute-0 sudo[86845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:53 compute-0 python3.9[86847]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:07:54 compute-0 sudo[86845]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:54 compute-0 sudo[86929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikmklfrewsspbnkfiqzcuolksqzlpcuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261673.2973225-114-280378964122010/AnsiballZ_dnf.py'
Feb 16 17:07:54 compute-0 sudo[86929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:54 compute-0 python3.9[86931]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:07:55 compute-0 sudo[86929]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:56 compute-0 sudo[87082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixgucsjduwiaeqphqywixtghjcocilby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261676.0806737-138-87207927571982/AnsiballZ_systemd.py'
Feb 16 17:07:56 compute-0 sudo[87082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:56 compute-0 python3.9[87084]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 17:07:57 compute-0 sudo[87082]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:57 compute-0 sudo[87237]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcyswczyfxeibjisfhpqggzlibydntjc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771261677.2256484-154-204217432268934/AnsiballZ_edpm_nftables_snippet.py'
Feb 16 17:07:57 compute-0 sudo[87237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:57 compute-0 python3[87239]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 16 17:07:57 compute-0 sudo[87237]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:58 compute-0 sudo[87389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pawbshktwbgborhihvneeddzpsjvgzoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261678.0751894-172-97438911749222/AnsiballZ_file.py'
Feb 16 17:07:58 compute-0 sudo[87389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:58 compute-0 python3.9[87391]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:58 compute-0 sudo[87389]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:59 compute-0 sudo[87541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymsybppfevpekovvajpjdzzmolbsdxgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261678.677772-188-112304484371474/AnsiballZ_stat.py'
Feb 16 17:07:59 compute-0 sudo[87541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:59 compute-0 python3.9[87543]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:07:59 compute-0 sudo[87541]: pam_unix(sudo:session): session closed for user root
Feb 16 17:07:59 compute-0 sudo[87619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwzxnniebntztwvvmztlmjlnznhjexnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261678.677772-188-112304484371474/AnsiballZ_file.py'
Feb 16 17:07:59 compute-0 sudo[87619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:07:59 compute-0 python3.9[87621]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:07:59 compute-0 sudo[87619]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:00 compute-0 sudo[87771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtcmhwzhhvqqbbcanoccdzaqzerhozke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261679.9189708-212-40650194105484/AnsiballZ_stat.py'
Feb 16 17:08:00 compute-0 sudo[87771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:00 compute-0 python3.9[87773]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:00 compute-0 sudo[87771]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:00 compute-0 sudo[87849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdrnzjwadbruagxpymqhxzyrfxffnlda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261679.9189708-212-40650194105484/AnsiballZ_file.py'
Feb 16 17:08:00 compute-0 sudo[87849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:00 compute-0 python3.9[87851]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.hab6ts7y recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:00 compute-0 sudo[87849]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:01 compute-0 sudo[88001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxmogzwpugzwcwcwmxzrdxymsblcgmag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261680.9284503-236-157482451524779/AnsiballZ_stat.py'
Feb 16 17:08:01 compute-0 sudo[88001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:01 compute-0 python3.9[88003]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:01 compute-0 sudo[88001]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:01 compute-0 sudo[88079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfmhqsvsdldskxlltlwteuzsxfpaniam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261680.9284503-236-157482451524779/AnsiballZ_file.py'
Feb 16 17:08:01 compute-0 sudo[88079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:01 compute-0 python3.9[88081]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:01 compute-0 sudo[88079]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:02 compute-0 sudo[88231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtthltjhoydftwbstwwllursyosetgoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261682.0777986-262-46902908653498/AnsiballZ_command.py'
Feb 16 17:08:02 compute-0 sudo[88231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:02 compute-0 python3.9[88233]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:08:02 compute-0 sudo[88231]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:03 compute-0 sudo[88384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsidymgyarqacnulxnjdahtzxykfaobx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771261682.9173248-278-129539355111380/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 17:08:03 compute-0 sudo[88384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:03 compute-0 python3[88386]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 17:08:03 compute-0 sudo[88384]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:04 compute-0 sudo[88536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjzhslubqkbmnlrkyesidrfldsnlaagt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261683.717844-294-227946534762021/AnsiballZ_stat.py'
Feb 16 17:08:04 compute-0 sudo[88536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:04 compute-0 python3.9[88538]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:04 compute-0 sudo[88536]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:04 compute-0 sudo[88661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvavnezzpqijaiyuxbeyasbvdkmvhdrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261683.717844-294-227946534762021/AnsiballZ_copy.py'
Feb 16 17:08:04 compute-0 sudo[88661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:04 compute-0 python3.9[88663]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261683.717844-294-227946534762021/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:04 compute-0 sudo[88661]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:05 compute-0 sudo[88813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqqfhbjycfnogqszuysccfkrcljlklfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261685.0270123-324-183397798288842/AnsiballZ_stat.py'
Feb 16 17:08:05 compute-0 sudo[88813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:05 compute-0 python3.9[88815]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:05 compute-0 sudo[88813]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:05 compute-0 sudo[88938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdpalygqahoapvmvnpoxquealbqxnjtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261685.0270123-324-183397798288842/AnsiballZ_copy.py'
Feb 16 17:08:05 compute-0 sudo[88938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:05 compute-0 python3.9[88940]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261685.0270123-324-183397798288842/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:06 compute-0 sudo[88938]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:06 compute-0 sudo[89090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrdzevskrjlpozkntiulfzhghmjqhkrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261686.1371372-354-98569821519182/AnsiballZ_stat.py'
Feb 16 17:08:06 compute-0 sudo[89090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:06 compute-0 python3.9[89092]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:06 compute-0 sudo[89090]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:06 compute-0 sudo[89215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gztfjqpnfxahqljrbgevquvnxhdetoak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261686.1371372-354-98569821519182/AnsiballZ_copy.py'
Feb 16 17:08:06 compute-0 sudo[89215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:07 compute-0 python3.9[89217]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261686.1371372-354-98569821519182/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:07 compute-0 sudo[89215]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:07 compute-0 sudo[89367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewprhtqjxfpptxdkmavtmjlmkryhtfju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261687.1793687-384-124545016066510/AnsiballZ_stat.py'
Feb 16 17:08:07 compute-0 sudo[89367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:07 compute-0 python3.9[89369]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:07 compute-0 sudo[89367]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:08 compute-0 sudo[89492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffjgbwkxpmhemmpvbrnvwainrvtlvarc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261687.1793687-384-124545016066510/AnsiballZ_copy.py'
Feb 16 17:08:08 compute-0 sudo[89492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:08 compute-0 python3.9[89494]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261687.1793687-384-124545016066510/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:08 compute-0 sudo[89492]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:08 compute-0 sudo[89646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vydmitqdwtjcyfoebqxephjljyudffdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261688.396226-414-132396349047717/AnsiballZ_stat.py'
Feb 16 17:08:08 compute-0 sudo[89646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:08 compute-0 python3.9[89648]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:08 compute-0 sudo[89646]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:09 compute-0 sudo[89771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgzmscsyotpqxfojcevjsxdfeauesexr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261688.396226-414-132396349047717/AnsiballZ_copy.py'
Feb 16 17:08:09 compute-0 sudo[89771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:09 compute-0 python3.9[89773]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261688.396226-414-132396349047717/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:09 compute-0 sudo[89771]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:09 compute-0 sudo[89923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqrnqltjxdbwrzyrtptrtqwdaxubzwjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261689.5039084-444-128697398187339/AnsiballZ_file.py'
Feb 16 17:08:09 compute-0 sudo[89923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:09 compute-0 python3.9[89925]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:10 compute-0 sudo[89923]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:10 compute-0 sudo[90075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qttmjwxanchrwwlamsdkjzrjppswfkvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261690.185626-460-134923195310769/AnsiballZ_command.py'
Feb 16 17:08:10 compute-0 sudo[90075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:10 compute-0 python3.9[90077]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:08:10 compute-0 sudo[90075]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:11 compute-0 sshd-session[89555]: Received disconnect from 27.190.15.128 port 51562:11:  [preauth]
Feb 16 17:08:11 compute-0 sshd-session[89555]: Disconnected from authenticating user root 27.190.15.128 port 51562 [preauth]
Feb 16 17:08:11 compute-0 sudo[90230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxzujjqinowanzabwsnkopmghsfmrzqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261690.8340125-476-239649596290878/AnsiballZ_blockinfile.py'
Feb 16 17:08:11 compute-0 sudo[90230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:11 compute-0 python3.9[90232]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:11 compute-0 sudo[90230]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:12 compute-0 sudo[90382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vklpzttwbhomxjkpqzwdhihiwcmzvcze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261691.8494174-494-73569488960547/AnsiballZ_command.py'
Feb 16 17:08:12 compute-0 sudo[90382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:12 compute-0 python3.9[90384]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:08:12 compute-0 sudo[90382]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:12 compute-0 sudo[90535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrprkwdaetcwjqsfgcrapnpbaglzpdyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261692.506093-510-71593932133379/AnsiballZ_stat.py'
Feb 16 17:08:12 compute-0 sudo[90535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:12 compute-0 python3.9[90537]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:08:12 compute-0 sudo[90535]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:13 compute-0 sudo[90689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbsoyaysqtwczlzehfkuyfnfdkxfhhtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261693.106817-526-214882024909544/AnsiballZ_command.py'
Feb 16 17:08:13 compute-0 sudo[90689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:13 compute-0 python3.9[90691]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:08:13 compute-0 sudo[90689]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:14 compute-0 sudo[90845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwgupbsifljaryvqxaxgsnyxuxjfpikg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261693.8037665-542-150984001851014/AnsiballZ_file.py'
Feb 16 17:08:14 compute-0 sudo[90845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:14 compute-0 python3.9[90847]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:14 compute-0 sudo[90845]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:15 compute-0 python3.9[90997]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:08:16 compute-0 sudo[91148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdoqtrxdjtsqhyxogzmzhpowtqmeqwbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261696.001345-622-235569474999327/AnsiballZ_command.py'
Feb 16 17:08:16 compute-0 sudo[91148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:16 compute-0 python3.9[91150]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:9f:1d:bd:e8" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:08:16 compute-0 ovs-vsctl[91151]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:9f:1d:bd:e8 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 16 17:08:16 compute-0 sudo[91148]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:16 compute-0 sudo[91301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuuktubkxlkxpixgijzdkxjhminhhosd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261696.7048812-640-86393176059004/AnsiballZ_command.py'
Feb 16 17:08:16 compute-0 sudo[91301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:17 compute-0 python3.9[91303]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:08:17 compute-0 sudo[91301]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:17 compute-0 sudo[91456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amqoyenceapybyyumbyqcjpvgslifihj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261697.298362-656-46784470367312/AnsiballZ_command.py'
Feb 16 17:08:17 compute-0 sudo[91456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:17 compute-0 python3.9[91458]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:08:17 compute-0 ovs-vsctl[91459]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 16 17:08:17 compute-0 sudo[91456]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:18 compute-0 python3.9[91609]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:08:18 compute-0 sudo[91761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovqheyjeieyijvfzfoucyuevuaypxgir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261698.5319283-690-188334962227558/AnsiballZ_file.py'
Feb 16 17:08:18 compute-0 sudo[91761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:18 compute-0 python3.9[91763]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:08:18 compute-0 sudo[91761]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:19 compute-0 sudo[91913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmrbollxiwadcgyritlkpvspicuyjqvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261699.1215792-706-277104352630664/AnsiballZ_stat.py'
Feb 16 17:08:19 compute-0 sudo[91913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:19 compute-0 python3.9[91915]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:19 compute-0 sudo[91913]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:19 compute-0 sudo[91991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmmxipnvenomsaxbmapoqvghxxjubfew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261699.1215792-706-277104352630664/AnsiballZ_file.py'
Feb 16 17:08:19 compute-0 sudo[91991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:19 compute-0 python3.9[91993]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:08:19 compute-0 sudo[91991]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:20 compute-0 sudo[92143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azxvswtgewnpgloraunuekxryhzvlrer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261700.0967166-706-111433672185799/AnsiballZ_stat.py'
Feb 16 17:08:20 compute-0 sudo[92143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:20 compute-0 python3.9[92145]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:20 compute-0 sudo[92143]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:20 compute-0 sudo[92221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gltmcmqxprxnulgxmclwuranlbksbyzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261700.0967166-706-111433672185799/AnsiballZ_file.py'
Feb 16 17:08:20 compute-0 sudo[92221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:20 compute-0 python3.9[92223]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:08:20 compute-0 sudo[92221]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:21 compute-0 sudo[92373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opolwcxacyuongwujwrnbkwothrfjweh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261701.1254776-752-264517639526992/AnsiballZ_file.py'
Feb 16 17:08:21 compute-0 sudo[92373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:21 compute-0 python3.9[92375]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:21 compute-0 sudo[92373]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:21 compute-0 sudo[92525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksjyykicjpswlmrtgiwxffxxxeugzhck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261701.7365136-768-1110577661916/AnsiballZ_stat.py'
Feb 16 17:08:21 compute-0 sudo[92525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:22 compute-0 python3.9[92527]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:22 compute-0 sudo[92525]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:22 compute-0 sudo[92603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jegqabbytjkqyftaurqvrarddflltjsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261701.7365136-768-1110577661916/AnsiballZ_file.py'
Feb 16 17:08:22 compute-0 sudo[92603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:22 compute-0 python3.9[92605]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:22 compute-0 sudo[92603]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:22 compute-0 sudo[92755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwhiaifhvjeysmbubdabkmfyunrzqlhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261702.74298-792-180823449775080/AnsiballZ_stat.py'
Feb 16 17:08:22 compute-0 sudo[92755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:23 compute-0 python3.9[92757]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:23 compute-0 sudo[92755]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:23 compute-0 sudo[92833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onpzmhzkpuiswarlfssruvtijhcemcaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261702.74298-792-180823449775080/AnsiballZ_file.py'
Feb 16 17:08:23 compute-0 sudo[92833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:23 compute-0 python3.9[92835]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:23 compute-0 sudo[92833]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:24 compute-0 sudo[92985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nedcfrqrbnennlpscermlhtgfvxrupzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261703.8522809-816-176240070862863/AnsiballZ_systemd.py'
Feb 16 17:08:24 compute-0 sudo[92985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:24 compute-0 python3.9[92987]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:08:24 compute-0 systemd[1]: Reloading.
Feb 16 17:08:24 compute-0 systemd-rc-local-generator[93005]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:08:24 compute-0 systemd-sysv-generator[93011]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:08:24 compute-0 sudo[92985]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:25 compute-0 sudo[93182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ointofkyfvbcrlkbsxumdrydqpwwucbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261704.877494-832-269430774927625/AnsiballZ_stat.py'
Feb 16 17:08:25 compute-0 sudo[93182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:25 compute-0 python3.9[93184]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:25 compute-0 sudo[93182]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:25 compute-0 sudo[93260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upoylzplrossqdzhmkigmqbwqtpsumiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261704.877494-832-269430774927625/AnsiballZ_file.py'
Feb 16 17:08:25 compute-0 sudo[93260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:25 compute-0 python3.9[93262]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:25 compute-0 sudo[93260]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:26 compute-0 sudo[93412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipivsblmpinmvzolzwzrmabtgzyzcecm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261705.8937747-856-199454032247408/AnsiballZ_stat.py'
Feb 16 17:08:26 compute-0 sudo[93412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:26 compute-0 python3.9[93414]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:26 compute-0 sudo[93412]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:26 compute-0 sudo[93490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtwtggeqldtnezrofvsysipmyyoqrijr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261705.8937747-856-199454032247408/AnsiballZ_file.py'
Feb 16 17:08:26 compute-0 sudo[93490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:26 compute-0 python3.9[93492]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:26 compute-0 sudo[93490]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:27 compute-0 sudo[93642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nntaigcnsxcsolrtmcmomhfpklrqemms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261706.8956137-880-2292986790706/AnsiballZ_systemd.py'
Feb 16 17:08:27 compute-0 sudo[93642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:27 compute-0 python3.9[93644]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:08:27 compute-0 systemd[1]: Reloading.
Feb 16 17:08:27 compute-0 systemd-sysv-generator[93672]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:08:27 compute-0 systemd-rc-local-generator[93667]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:08:27 compute-0 systemd[1]: Starting Create netns directory...
Feb 16 17:08:27 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 16 17:08:27 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 16 17:08:27 compute-0 systemd[1]: Finished Create netns directory.
Feb 16 17:08:27 compute-0 sudo[93642]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:28 compute-0 sudo[93842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkcccldhrwxnouywiihyqwjtzcwyfifj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261708.0861044-900-220542694262907/AnsiballZ_file.py'
Feb 16 17:08:28 compute-0 sudo[93842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:28 compute-0 python3.9[93844]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:08:28 compute-0 sudo[93842]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:29 compute-0 sudo[93994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcdxchmvxiarasjujkymeakaamrphone ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261708.747092-916-40611524888329/AnsiballZ_stat.py'
Feb 16 17:08:29 compute-0 sudo[93994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:29 compute-0 python3.9[93996]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:29 compute-0 sudo[93994]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:29 compute-0 sudo[94117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbmwkinqukkivfkvssjflmonfohhfsok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261708.747092-916-40611524888329/AnsiballZ_copy.py'
Feb 16 17:08:29 compute-0 sudo[94117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:29 compute-0 python3.9[94119]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261708.747092-916-40611524888329/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:08:29 compute-0 sudo[94117]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:30 compute-0 sudo[94269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrtqzcluwcuztwycbcanozytijjeznss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261710.219631-950-126914954550609/AnsiballZ_file.py'
Feb 16 17:08:30 compute-0 sudo[94269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:30 compute-0 python3.9[94271]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:30 compute-0 sudo[94269]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:31 compute-0 sudo[94421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqxskkbsvbxuniwpftpztqnezefprjpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261710.8458283-966-2916622166866/AnsiballZ_file.py'
Feb 16 17:08:31 compute-0 sudo[94421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:31 compute-0 python3.9[94423]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:08:31 compute-0 sudo[94421]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:31 compute-0 sudo[94573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkdkgygvsxjbunipncjtssbiotpicezr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261711.5953321-982-274125428779548/AnsiballZ_stat.py'
Feb 16 17:08:31 compute-0 sudo[94573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:32 compute-0 python3.9[94575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:32 compute-0 sudo[94573]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:32 compute-0 sudo[94696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgfwvizdnrvpulbacdhodrblydjqutrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261711.5953321-982-274125428779548/AnsiballZ_copy.py'
Feb 16 17:08:32 compute-0 sudo[94696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:32 compute-0 python3.9[94698]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261711.5953321-982-274125428779548/.source.json _original_basename=.l1zfgj6k follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:32 compute-0 sudo[94696]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:33 compute-0 python3.9[94848]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:35 compute-0 sudo[95269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sklzjulmmhuapgspisoczwsmmnaajqpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261714.9065282-1062-249021709015387/AnsiballZ_container_config_data.py'
Feb 16 17:08:35 compute-0 sudo[95269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:35 compute-0 python3.9[95271]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 16 17:08:35 compute-0 sudo[95269]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:36 compute-0 sudo[95421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwdxslioozyokpmxervhhrmwoqhjbjkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261715.8325086-1084-41136144749309/AnsiballZ_container_config_hash.py'
Feb 16 17:08:36 compute-0 sudo[95421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:36 compute-0 python3.9[95423]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 17:08:36 compute-0 sudo[95421]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:37 compute-0 sudo[95573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uexfqfoqjahxdiuqkdcvhpfqfmnxpcel ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771261716.7784274-1104-213054176454878/AnsiballZ_edpm_container_manage.py'
Feb 16 17:08:37 compute-0 sudo[95573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:37 compute-0 python3[95575]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 17:08:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:08:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:08:37 compute-0 podman[95611]: 2026-02-16 17:08:37.680844607 +0000 UTC m=+0.049129997 container create 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 16 17:08:37 compute-0 podman[95611]: 2026-02-16 17:08:37.656389817 +0000 UTC m=+0.024675217 image pull 9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 16 17:08:37 compute-0 python3[95575]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 16 17:08:37 compute-0 sudo[95573]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:38 compute-0 sudo[95799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dghkkvrhteylwkmysluzjdnfktvxclej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261718.0252464-1120-162655709578219/AnsiballZ_stat.py'
Feb 16 17:08:38 compute-0 sudo[95799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:38 compute-0 python3.9[95801]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:08:38 compute-0 sudo[95799]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:38 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 16 17:08:39 compute-0 sudo[95953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojrgfjzwkxhyfwufgbholepynpuilmnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261718.703014-1138-263571140656016/AnsiballZ_file.py'
Feb 16 17:08:39 compute-0 sudo[95953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:39 compute-0 python3.9[95955]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:39 compute-0 sudo[95953]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:39 compute-0 sudo[96029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vckbmhdpthdfffykmsrcdxrltaottchx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261718.703014-1138-263571140656016/AnsiballZ_stat.py'
Feb 16 17:08:39 compute-0 sudo[96029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:39 compute-0 python3.9[96031]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:08:39 compute-0 sudo[96029]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:40 compute-0 sudo[96180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klevlqyejdvkffuvatsbpsvarapofkso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261719.7197483-1138-42341200189115/AnsiballZ_copy.py'
Feb 16 17:08:40 compute-0 sudo[96180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:40 compute-0 python3.9[96182]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771261719.7197483-1138-42341200189115/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:40 compute-0 sudo[96180]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:40 compute-0 sudo[96256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iitdxqdspvkgvolfzeoxqimvxazlyzdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261719.7197483-1138-42341200189115/AnsiballZ_systemd.py'
Feb 16 17:08:40 compute-0 sudo[96256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:40 compute-0 python3.9[96258]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 17:08:40 compute-0 systemd[1]: Reloading.
Feb 16 17:08:40 compute-0 systemd-rc-local-generator[96283]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:08:40 compute-0 systemd-sysv-generator[96286]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:08:41 compute-0 sudo[96256]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:41 compute-0 sudo[96373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djfsmstkbjoiklwvtxbkexunulntgwui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261719.7197483-1138-42341200189115/AnsiballZ_systemd.py'
Feb 16 17:08:41 compute-0 sudo[96373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:41 compute-0 python3.9[96375]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:08:41 compute-0 systemd[1]: Reloading.
Feb 16 17:08:41 compute-0 systemd-sysv-generator[96404]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:08:41 compute-0 systemd-rc-local-generator[96401]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:08:42 compute-0 systemd[1]: Starting ovn_controller container...
Feb 16 17:08:42 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Feb 16 17:08:42 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:08:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fdd2c6d1ef7ab54fb778c72f3dbea70d8bed1762769f9f5bb02cfff3b93fe61/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 16 17:08:42 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73.
Feb 16 17:08:42 compute-0 podman[96422]: 2026-02-16 17:08:42.240467437 +0000 UTC m=+0.144858056 container init 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 16 17:08:42 compute-0 ovn_controller[96437]: + sudo -E kolla_set_configs
Feb 16 17:08:42 compute-0 podman[96422]: 2026-02-16 17:08:42.27154135 +0000 UTC m=+0.175931929 container start 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 16 17:08:42 compute-0 edpm-start-podman-container[96422]: ovn_controller
Feb 16 17:08:42 compute-0 systemd[1]: Created slice User Slice of UID 0.
Feb 16 17:08:42 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 16 17:08:42 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 16 17:08:42 compute-0 systemd[1]: Starting User Manager for UID 0...
Feb 16 17:08:42 compute-0 systemd[96469]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Feb 16 17:08:42 compute-0 edpm-start-podman-container[96421]: Creating additional drop-in dependency for "ovn_controller" (6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73)
Feb 16 17:08:42 compute-0 podman[96444]: 2026-02-16 17:08:42.358697429 +0000 UTC m=+0.074687814 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 16 17:08:42 compute-0 systemd[1]: 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73-714d0439b934ec5c.service: Main process exited, code=exited, status=1/FAILURE
Feb 16 17:08:42 compute-0 systemd[1]: 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73-714d0439b934ec5c.service: Failed with result 'exit-code'.
Feb 16 17:08:42 compute-0 systemd[1]: Reloading.
Feb 16 17:08:42 compute-0 systemd[96469]: Queued start job for default target Main User Target.
Feb 16 17:08:42 compute-0 systemd[96469]: Created slice User Application Slice.
Feb 16 17:08:42 compute-0 systemd[96469]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 16 17:08:42 compute-0 systemd[96469]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 17:08:42 compute-0 systemd[96469]: Reached target Paths.
Feb 16 17:08:42 compute-0 systemd[96469]: Reached target Timers.
Feb 16 17:08:42 compute-0 systemd-rc-local-generator[96524]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:08:42 compute-0 systemd[96469]: Starting D-Bus User Message Bus Socket...
Feb 16 17:08:42 compute-0 systemd[96469]: Starting Create User's Volatile Files and Directories...
Feb 16 17:08:42 compute-0 systemd-sysv-generator[96528]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:08:42 compute-0 systemd[96469]: Finished Create User's Volatile Files and Directories.
Feb 16 17:08:42 compute-0 systemd[96469]: Listening on D-Bus User Message Bus Socket.
Feb 16 17:08:42 compute-0 systemd[96469]: Reached target Sockets.
Feb 16 17:08:42 compute-0 systemd[96469]: Reached target Basic System.
Feb 16 17:08:42 compute-0 systemd[96469]: Reached target Main User Target.
Feb 16 17:08:42 compute-0 systemd[96469]: Startup finished in 118ms.
Feb 16 17:08:42 compute-0 systemd[1]: Started User Manager for UID 0.
Feb 16 17:08:42 compute-0 systemd[1]: Started ovn_controller container.
Feb 16 17:08:42 compute-0 systemd[1]: Started Session c1 of User root.
Feb 16 17:08:42 compute-0 sudo[96373]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:42 compute-0 ovn_controller[96437]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 17:08:42 compute-0 ovn_controller[96437]: INFO:__main__:Validating config file
Feb 16 17:08:42 compute-0 ovn_controller[96437]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 17:08:42 compute-0 ovn_controller[96437]: INFO:__main__:Writing out command to execute
Feb 16 17:08:42 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Feb 16 17:08:42 compute-0 ovn_controller[96437]: ++ cat /run_command
Feb 16 17:08:42 compute-0 ovn_controller[96437]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 16 17:08:42 compute-0 ovn_controller[96437]: + ARGS=
Feb 16 17:08:42 compute-0 ovn_controller[96437]: + sudo kolla_copy_cacerts
Feb 16 17:08:42 compute-0 systemd[1]: Started Session c2 of User root.
Feb 16 17:08:42 compute-0 ovn_controller[96437]: + [[ ! -n '' ]]
Feb 16 17:08:42 compute-0 ovn_controller[96437]: + . kolla_extend_start
Feb 16 17:08:42 compute-0 ovn_controller[96437]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 16 17:08:42 compute-0 ovn_controller[96437]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Feb 16 17:08:42 compute-0 ovn_controller[96437]: + umask 0022
Feb 16 17:08:42 compute-0 ovn_controller[96437]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Feb 16 17:08:42 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 16 17:08:42 compute-0 NetworkManager[56463]: <info>  [1771261722.7780] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Feb 16 17:08:42 compute-0 NetworkManager[56463]: <info>  [1771261722.7788] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 16 17:08:42 compute-0 NetworkManager[56463]: <warn>  [1771261722.7792] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 16 17:08:42 compute-0 NetworkManager[56463]: <info>  [1771261722.7801] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Feb 16 17:08:42 compute-0 NetworkManager[56463]: <info>  [1771261722.7808] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Feb 16 17:08:42 compute-0 NetworkManager[56463]: <info>  [1771261722.7812] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 16 17:08:42 compute-0 kernel: br-int: entered promiscuous mode
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00014|main|INFO|OVS feature set changed, force recompute.
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00022|main|INFO|OVS feature set changed, force recompute.
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 17:08:42 compute-0 ovn_controller[96437]: 2026-02-16T17:08:42Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 16 17:08:42 compute-0 NetworkManager[56463]: <info>  [1771261722.7962] manager: (ovn-2e3a84-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Feb 16 17:08:42 compute-0 NetworkManager[56463]: <info>  [1771261722.7971] manager: (ovn-0be0c9-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Feb 16 17:08:42 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Feb 16 17:08:42 compute-0 NetworkManager[56463]: <info>  [1771261722.8145] device (genev_sys_6081): carrier: link connected
Feb 16 17:08:42 compute-0 NetworkManager[56463]: <info>  [1771261722.8148] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Feb 16 17:08:42 compute-0 systemd-udevd[96581]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:08:42 compute-0 systemd-udevd[96585]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:08:43 compute-0 python3.9[96713]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 17:08:44 compute-0 sudo[96863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toljkolebnhxtynimpcyjpoopssetqpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261723.903632-1228-97770657010936/AnsiballZ_stat.py'
Feb 16 17:08:44 compute-0 sudo[96863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:44 compute-0 python3.9[96865]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:08:44 compute-0 sudo[96863]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:44 compute-0 sudo[96986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qilonfkbmlhegbxytcugcviimuojjwam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261723.903632-1228-97770657010936/AnsiballZ_copy.py'
Feb 16 17:08:44 compute-0 sudo[96986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:44 compute-0 python3.9[96988]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261723.903632-1228-97770657010936/.source.yaml _original_basename=.my1vl5gc follow=False checksum=d1e0e07001414c4c103c3314c40139b58373e5ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:08:44 compute-0 sudo[96986]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:45 compute-0 sudo[97138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sajisjozpnwxkeeksjpniifgvjnoagoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261725.2276752-1258-128053718054546/AnsiballZ_command.py'
Feb 16 17:08:45 compute-0 sudo[97138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:45 compute-0 python3.9[97140]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:08:45 compute-0 ovs-vsctl[97141]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 16 17:08:45 compute-0 sudo[97138]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:46 compute-0 sudo[97291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exsfqzkbbephiimlydynqmecwsmcanwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261725.862357-1274-243947710013811/AnsiballZ_command.py'
Feb 16 17:08:46 compute-0 sudo[97291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:46 compute-0 python3.9[97293]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:08:46 compute-0 ovs-vsctl[97295]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 16 17:08:46 compute-0 sudo[97291]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:47 compute-0 sudo[97446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqqyxzuqdsphoceuaizpljedssqslahf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261726.7462633-1302-147731989489087/AnsiballZ_command.py'
Feb 16 17:08:47 compute-0 sudo[97446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:47 compute-0 python3.9[97448]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:08:47 compute-0 ovs-vsctl[97449]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 16 17:08:47 compute-0 sudo[97446]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:47 compute-0 sshd-session[85931]: Connection closed by 192.168.122.30 port 38586
Feb 16 17:08:47 compute-0 sshd-session[85928]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:08:47 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Feb 16 17:08:47 compute-0 systemd[1]: session-20.scope: Consumed 42.205s CPU time.
Feb 16 17:08:47 compute-0 systemd-logind[821]: Session 20 logged out. Waiting for processes to exit.
Feb 16 17:08:47 compute-0 systemd-logind[821]: Removed session 20.
Feb 16 17:08:52 compute-0 systemd[1]: Stopping User Manager for UID 0...
Feb 16 17:08:52 compute-0 systemd[96469]: Activating special unit Exit the Session...
Feb 16 17:08:52 compute-0 systemd[96469]: Stopped target Main User Target.
Feb 16 17:08:52 compute-0 systemd[96469]: Stopped target Basic System.
Feb 16 17:08:52 compute-0 systemd[96469]: Stopped target Paths.
Feb 16 17:08:52 compute-0 systemd[96469]: Stopped target Sockets.
Feb 16 17:08:52 compute-0 systemd[96469]: Stopped target Timers.
Feb 16 17:08:52 compute-0 systemd[96469]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 17:08:52 compute-0 systemd[96469]: Closed D-Bus User Message Bus Socket.
Feb 16 17:08:52 compute-0 systemd[96469]: Stopped Create User's Volatile Files and Directories.
Feb 16 17:08:52 compute-0 systemd[96469]: Removed slice User Application Slice.
Feb 16 17:08:52 compute-0 systemd[96469]: Reached target Shutdown.
Feb 16 17:08:52 compute-0 systemd[96469]: Finished Exit the Session.
Feb 16 17:08:52 compute-0 systemd[96469]: Reached target Exit the Session.
Feb 16 17:08:52 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Feb 16 17:08:52 compute-0 systemd[1]: Stopped User Manager for UID 0.
Feb 16 17:08:52 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 16 17:08:52 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 16 17:08:52 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 16 17:08:52 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 16 17:08:52 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Feb 16 17:08:53 compute-0 sshd-session[97476]: Accepted publickey for zuul from 192.168.122.30 port 54520 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:08:53 compute-0 systemd-logind[821]: New session 22 of user zuul.
Feb 16 17:08:53 compute-0 systemd[1]: Started Session 22 of User zuul.
Feb 16 17:08:53 compute-0 sshd-session[97476]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:08:54 compute-0 python3.9[97629]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:08:55 compute-0 sudo[97783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omntpjxjzrhuzuqnuxzijaypdwyhuvfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261734.8025556-48-114227831118407/AnsiballZ_file.py'
Feb 16 17:08:55 compute-0 sudo[97783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:55 compute-0 python3.9[97785]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:08:55 compute-0 sudo[97783]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:55 compute-0 sudo[97935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcxnnijvzwklwifiaervgqczyaymwnpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261735.5410676-48-217973391701483/AnsiballZ_file.py'
Feb 16 17:08:55 compute-0 sudo[97935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:56 compute-0 python3.9[97937]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:08:56 compute-0 sudo[97935]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:56 compute-0 sudo[98087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bizgltpphmmgwjjedifnrkiutdfnrrdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261736.170825-48-169341259685127/AnsiballZ_file.py'
Feb 16 17:08:56 compute-0 sudo[98087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:56 compute-0 python3.9[98089]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:08:56 compute-0 sudo[98087]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:57 compute-0 sudo[98239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhbbllhialpwtzzpgqrzbeprplawjaxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261736.8990614-48-250117330325862/AnsiballZ_file.py'
Feb 16 17:08:57 compute-0 sudo[98239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:57 compute-0 python3.9[98241]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:08:57 compute-0 sudo[98239]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:57 compute-0 sudo[98391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzuuhtcxxeoyorwmdqxuvykxlnapcusl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261737.4680824-48-260054813133973/AnsiballZ_file.py'
Feb 16 17:08:57 compute-0 sudo[98391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:57 compute-0 python3.9[98393]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:08:57 compute-0 sudo[98391]: pam_unix(sudo:session): session closed for user root
Feb 16 17:08:58 compute-0 python3.9[98543]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:08:59 compute-0 sudo[98693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgrwcdufboyyznwmvkzyydjpzkaqiqjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261738.7427979-136-35412267486677/AnsiballZ_seboolean.py'
Feb 16 17:08:59 compute-0 sudo[98693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:08:59 compute-0 python3.9[98695]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 16 17:08:59 compute-0 sudo[98693]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:00 compute-0 python3.9[98845]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:01 compute-0 python3.9[98967]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261740.0413706-152-88969680535754/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:01 compute-0 python3.9[99117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:02 compute-0 python3.9[99238]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261741.4193764-182-91496961340300/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:02 compute-0 sudo[99388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxqtlxspfzcpvifzsgwchwefjboaxbzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261742.6600533-216-205058439189628/AnsiballZ_setup.py'
Feb 16 17:09:02 compute-0 sudo[99388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:03 compute-0 python3.9[99390]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:09:03 compute-0 sudo[99388]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:03 compute-0 sudo[99472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csvrwxpvyuqvxhswvihvzmqltgexubux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261742.6600533-216-205058439189628/AnsiballZ_dnf.py'
Feb 16 17:09:03 compute-0 sudo[99472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:04 compute-0 python3.9[99474]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:09:05 compute-0 sudo[99472]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:06 compute-0 sudo[99625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfjlebqsvenagicienljlxscobivawjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261745.6364594-240-247413430329911/AnsiballZ_systemd.py'
Feb 16 17:09:06 compute-0 sudo[99625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:06 compute-0 python3.9[99627]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 17:09:06 compute-0 sudo[99625]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:07 compute-0 python3.9[99780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:07 compute-0 python3.9[99901]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261746.8732526-256-150289538575060/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:08 compute-0 python3.9[100051]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:08 compute-0 python3.9[100172]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261748.0424814-256-92359025616388/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:10 compute-0 python3.9[100323]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:10 compute-0 python3.9[100444]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261750.0551112-344-49452658663072/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:11 compute-0 python3.9[100594]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:12 compute-0 python3.9[100715]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261751.095165-344-161140746107107/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:12 compute-0 ovn_controller[96437]: 2026-02-16T17:09:12Z|00025|memory|INFO|16128 kB peak resident set size after 29.7 seconds
Feb 16 17:09:12 compute-0 ovn_controller[96437]: 2026-02-16T17:09:12Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Feb 16 17:09:12 compute-0 podman[100839]: 2026-02-16 17:09:12.512456513 +0000 UTC m=+0.089005724 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:09:12 compute-0 python3.9[100875]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:09:13 compute-0 sudo[101044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvwhbfpguatprennvhmhfqtbyfcympdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261752.830984-420-106930218509288/AnsiballZ_file.py'
Feb 16 17:09:13 compute-0 sudo[101044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:13 compute-0 python3.9[101046]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:13 compute-0 sudo[101044]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:13 compute-0 sudo[101196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwtyhbqkuccqpbouakhfxiohwjxuokhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261753.6244903-436-92034834103015/AnsiballZ_stat.py'
Feb 16 17:09:13 compute-0 sudo[101196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:14 compute-0 python3.9[101198]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:14 compute-0 sudo[101196]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:14 compute-0 sudo[101274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqwasrnhmrhcbtgdtvpornoxsbircscf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261753.6244903-436-92034834103015/AnsiballZ_file.py'
Feb 16 17:09:14 compute-0 sudo[101274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:14 compute-0 python3.9[101276]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:14 compute-0 sudo[101274]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:15 compute-0 sudo[101426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyxrxhjidmipmyjwoqsxjajaweqmfpkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261754.7347898-436-115428576560052/AnsiballZ_stat.py'
Feb 16 17:09:15 compute-0 sudo[101426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:15 compute-0 python3.9[101428]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:15 compute-0 sudo[101426]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:15 compute-0 sudo[101504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktjlcjvgmljpdigawsxlqxfeigbjffjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261754.7347898-436-115428576560052/AnsiballZ_file.py'
Feb 16 17:09:15 compute-0 sudo[101504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:15 compute-0 python3.9[101506]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:15 compute-0 sudo[101504]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:15 compute-0 sudo[101656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbyxjgevzembnhieoqqsjetcqjsnlujg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261755.751058-482-204546270322834/AnsiballZ_file.py'
Feb 16 17:09:15 compute-0 sudo[101656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:16 compute-0 python3.9[101658]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:16 compute-0 sudo[101656]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:16 compute-0 sudo[101808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sihznyrljzyqqetkeghrpwhvbsnamqcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261756.3256-498-262774706274473/AnsiballZ_stat.py'
Feb 16 17:09:16 compute-0 sudo[101808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:16 compute-0 python3.9[101810]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:16 compute-0 sudo[101808]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:17 compute-0 sudo[101886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvzxaivsvsmlhphgawmrwjvfzwytyczy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261756.3256-498-262774706274473/AnsiballZ_file.py'
Feb 16 17:09:17 compute-0 sudo[101886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:17 compute-0 python3.9[101888]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:17 compute-0 sudo[101886]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:17 compute-0 sudo[102038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nviwnrahnhwpqgjlfhajfebmhxbupmhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261757.6350496-522-238713246050591/AnsiballZ_stat.py'
Feb 16 17:09:17 compute-0 sudo[102038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:18 compute-0 python3.9[102040]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:18 compute-0 sudo[102038]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:18 compute-0 sudo[102116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hoxajlkhrjqnegkijehghqjqlmugjfbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261757.6350496-522-238713246050591/AnsiballZ_file.py'
Feb 16 17:09:18 compute-0 sudo[102116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:18 compute-0 python3.9[102118]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:18 compute-0 sudo[102116]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:18 compute-0 sudo[102268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxtldlvojbqcwgbrxmohwfrmvsatezjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261758.681216-546-19237932871860/AnsiballZ_systemd.py'
Feb 16 17:09:18 compute-0 sudo[102268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:19 compute-0 python3.9[102270]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:09:19 compute-0 systemd[1]: Reloading.
Feb 16 17:09:19 compute-0 systemd-rc-local-generator[102299]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:09:19 compute-0 systemd-sysv-generator[102302]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:09:19 compute-0 sudo[102268]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:19 compute-0 sudo[102466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpgplqnrdgimvwpwixhcsfdgnnbcaofl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261759.6799877-562-61111657936035/AnsiballZ_stat.py'
Feb 16 17:09:19 compute-0 sudo[102466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:20 compute-0 python3.9[102468]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:20 compute-0 sudo[102466]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:20 compute-0 sudo[102544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrumrscmdsolgdjxqzsoorwgzsjlnywu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261759.6799877-562-61111657936035/AnsiballZ_file.py'
Feb 16 17:09:20 compute-0 sudo[102544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:20 compute-0 python3.9[102546]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:20 compute-0 sudo[102544]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:21 compute-0 sudo[102696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmpsignrbggjwollarrlvwvfyxyoviao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261760.7653716-586-277421594472625/AnsiballZ_stat.py'
Feb 16 17:09:21 compute-0 sudo[102696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:21 compute-0 python3.9[102698]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:21 compute-0 sudo[102696]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:21 compute-0 sudo[102774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wljonzlncacyxbjvlxcaxeffokcqivyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261760.7653716-586-277421594472625/AnsiballZ_file.py'
Feb 16 17:09:21 compute-0 sudo[102774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:21 compute-0 python3.9[102776]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:21 compute-0 sudo[102774]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:22 compute-0 sudo[102926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwngrdkhowyhzxyujizvbtqzjmqulosb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261761.7709541-610-197860255700456/AnsiballZ_systemd.py'
Feb 16 17:09:22 compute-0 sudo[102926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:22 compute-0 python3.9[102928]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:09:22 compute-0 systemd[1]: Reloading.
Feb 16 17:09:22 compute-0 systemd-rc-local-generator[102960]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:09:22 compute-0 systemd-sysv-generator[102966]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:09:22 compute-0 systemd[1]: Starting Create netns directory...
Feb 16 17:09:22 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 16 17:09:22 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 16 17:09:22 compute-0 systemd[1]: Finished Create netns directory.
Feb 16 17:09:22 compute-0 sudo[102926]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:23 compute-0 sudo[103127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phpfqiijuxuuzitpqgmkzvlqduwwwxxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261762.993402-630-188188372825144/AnsiballZ_file.py'
Feb 16 17:09:23 compute-0 sudo[103127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:23 compute-0 python3.9[103129]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:23 compute-0 sudo[103127]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:23 compute-0 sudo[103279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdszsjppfakumwguufajzrozsqmdixre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261763.5793924-646-113853739973587/AnsiballZ_stat.py'
Feb 16 17:09:23 compute-0 sudo[103279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:24 compute-0 python3.9[103281]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:24 compute-0 sudo[103279]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:24 compute-0 sudo[103402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiybobdoabirkpcjmqbagbbrtzuofyzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261763.5793924-646-113853739973587/AnsiballZ_copy.py'
Feb 16 17:09:24 compute-0 sudo[103402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:24 compute-0 python3.9[103404]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771261763.5793924-646-113853739973587/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:24 compute-0 sudo[103402]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:25 compute-0 sudo[103554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-safuwkiijvuoyuxsqakoszykfdtikhjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261764.897136-680-280406371136439/AnsiballZ_file.py'
Feb 16 17:09:25 compute-0 sudo[103554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:25 compute-0 python3.9[103556]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:25 compute-0 sudo[103554]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:25 compute-0 sudo[103706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pcoiwyaveboypdjdubccpsluhnxosgdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261765.5530083-696-72274400867239/AnsiballZ_file.py'
Feb 16 17:09:25 compute-0 sudo[103706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:25 compute-0 python3.9[103708]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:09:25 compute-0 sudo[103706]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:26 compute-0 sudo[103858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klxcfcvvnlsfmqhzammkrcztsvevjxpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261766.2029204-712-276135652841464/AnsiballZ_stat.py'
Feb 16 17:09:26 compute-0 sudo[103858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:26 compute-0 python3.9[103860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:26 compute-0 sudo[103858]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:26 compute-0 sudo[103981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znjvdisasgpudoxtvjmgeygleoyzjrmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261766.2029204-712-276135652841464/AnsiballZ_copy.py'
Feb 16 17:09:26 compute-0 sudo[103981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:27 compute-0 python3.9[103983]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261766.2029204-712-276135652841464/.source.json _original_basename=.f81_ikzy follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:27 compute-0 sudo[103981]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:27 compute-0 python3.9[104133]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:29 compute-0 sudo[104554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkvogxpbcvooamzolynytjzrfvhqosjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261769.2104316-792-93832584450492/AnsiballZ_container_config_data.py'
Feb 16 17:09:29 compute-0 sudo[104554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:29 compute-0 python3.9[104556]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 16 17:09:29 compute-0 sudo[104554]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:30 compute-0 sudo[104706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gynzgkyejvbnnloozytqhgvaasrsetwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261770.097903-814-8180386006483/AnsiballZ_container_config_hash.py'
Feb 16 17:09:30 compute-0 sudo[104706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:30 compute-0 python3.9[104708]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 17:09:30 compute-0 sudo[104706]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:31 compute-0 sudo[104858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wptcztfomhbinqhfrvaibaqqsgqdlion ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771261770.9823325-834-212035474891553/AnsiballZ_edpm_container_manage.py'
Feb 16 17:09:31 compute-0 sudo[104858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:31 compute-0 python3[104860]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 17:09:31 compute-0 podman[104898]: 2026-02-16 17:09:31.876386228 +0000 UTC m=+0.061852643 container create 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260127, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Feb 16 17:09:31 compute-0 podman[104898]: 2026-02-16 17:09:31.843478344 +0000 UTC m=+0.028944869 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:09:31 compute-0 python3[104860]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:09:32 compute-0 sudo[104858]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:32 compute-0 sudo[105084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gauulpxktgrsdwuxgoaqwhyavvxhgfjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261772.1745949-850-124795793975483/AnsiballZ_stat.py'
Feb 16 17:09:32 compute-0 sudo[105084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:32 compute-0 python3.9[105086]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:09:32 compute-0 sudo[105084]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:33 compute-0 sudo[105238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukearelucauzbltwhceadardbyhgefmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261772.981674-868-55576527105468/AnsiballZ_file.py'
Feb 16 17:09:33 compute-0 sudo[105238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:33 compute-0 python3.9[105240]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:33 compute-0 sudo[105238]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:33 compute-0 sudo[105314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibdptyqcuqmawemejwbgibpmvpqimuse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261772.981674-868-55576527105468/AnsiballZ_stat.py'
Feb 16 17:09:33 compute-0 sudo[105314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:33 compute-0 python3.9[105316]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:09:33 compute-0 sudo[105314]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:34 compute-0 sudo[105465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnjlfzphbkgkhfhkpjfoojixwltpxhgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261773.943323-868-266256940958376/AnsiballZ_copy.py'
Feb 16 17:09:34 compute-0 sudo[105465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:34 compute-0 python3.9[105467]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771261773.943323-868-266256940958376/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:34 compute-0 sudo[105465]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:34 compute-0 sudo[105541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqfpdelzlgzkeolqaxmmdzzkphtwkbgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261773.943323-868-266256940958376/AnsiballZ_systemd.py'
Feb 16 17:09:34 compute-0 sudo[105541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:35 compute-0 python3.9[105543]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 17:09:35 compute-0 systemd[1]: Reloading.
Feb 16 17:09:35 compute-0 systemd-rc-local-generator[105568]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:09:35 compute-0 systemd-sysv-generator[105571]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:09:35 compute-0 sudo[105541]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:35 compute-0 sudo[105659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybqnsgfpgtvruyfjsqefcrjvzfmtrlbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261773.943323-868-266256940958376/AnsiballZ_systemd.py'
Feb 16 17:09:35 compute-0 sudo[105659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:35 compute-0 python3.9[105661]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:09:35 compute-0 systemd[1]: Reloading.
Feb 16 17:09:36 compute-0 systemd-rc-local-generator[105692]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:09:36 compute-0 systemd-sysv-generator[105695]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:09:36 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 16 17:09:36 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:09:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4ebf3f29d03def98e3ea47dbadcc923dc638473f402d28dadcf23db02a2e69b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 16 17:09:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4ebf3f29d03def98e3ea47dbadcc923dc638473f402d28dadcf23db02a2e69b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:09:36 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907.
Feb 16 17:09:36 compute-0 podman[105709]: 2026-02-16 17:09:36.320685499 +0000 UTC m=+0.138573803 container init 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: + sudo -E kolla_set_configs
Feb 16 17:09:36 compute-0 podman[105709]: 2026-02-16 17:09:36.345881648 +0000 UTC m=+0.163769932 container start 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 17:09:36 compute-0 edpm-start-podman-container[105709]: ovn_metadata_agent
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Validating config file
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Copying service configuration files
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Writing out command to execute
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: ++ cat /run_command
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: + CMD=neutron-ovn-metadata-agent
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: + ARGS=
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: + sudo kolla_copy_cacerts
Feb 16 17:09:36 compute-0 edpm-start-podman-container[105708]: Creating additional drop-in dependency for "ovn_metadata_agent" (216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907)
Feb 16 17:09:36 compute-0 podman[105732]: 2026-02-16 17:09:36.411094282 +0000 UTC m=+0.056036979 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: + [[ ! -n '' ]]
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: + . kolla_extend_start
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: Running command: 'neutron-ovn-metadata-agent'
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: + umask 0022
Feb 16 17:09:36 compute-0 ovn_metadata_agent[105725]: + exec neutron-ovn-metadata-agent
Feb 16 17:09:36 compute-0 systemd[1]: Reloading.
Feb 16 17:09:36 compute-0 systemd-rc-local-generator[105801]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:09:36 compute-0 systemd-sysv-generator[105804]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:09:36 compute-0 systemd[1]: Started ovn_metadata_agent container.
Feb 16 17:09:36 compute-0 sudo[105659]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:37 compute-0 python3.9[105967]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.074 105730 INFO neutron.common.config [-] Logging enabled!
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.074 105730 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.075 105730 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.075 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.075 105730 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.075 105730 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.076 105730 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.076 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.076 105730 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.076 105730 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.076 105730 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.076 105730 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.077 105730 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.077 105730 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.077 105730 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.077 105730 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.077 105730 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.077 105730 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.077 105730 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.078 105730 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.078 105730 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.078 105730 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.078 105730 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.078 105730 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.078 105730 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.078 105730 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.079 105730 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.079 105730 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.079 105730 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.079 105730 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.079 105730 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.079 105730 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.079 105730 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.080 105730 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.080 105730 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.080 105730 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.080 105730 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.080 105730 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.080 105730 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.081 105730 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.081 105730 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.081 105730 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.081 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.081 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.081 105730 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.082 105730 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.082 105730 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.082 105730 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.082 105730 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.082 105730 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.082 105730 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.082 105730 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.082 105730 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.083 105730 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.083 105730 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.083 105730 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.083 105730 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.083 105730 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.083 105730 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.083 105730 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.084 105730 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.084 105730 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.084 105730 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.084 105730 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.084 105730 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.084 105730 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.085 105730 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.085 105730 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.085 105730 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.085 105730 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.085 105730 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.085 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.085 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.086 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.086 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.086 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.086 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.086 105730 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.086 105730 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.086 105730 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.087 105730 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.087 105730 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.087 105730 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.087 105730 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.087 105730 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.087 105730 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.087 105730 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.088 105730 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.088 105730 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.088 105730 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.088 105730 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.088 105730 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.088 105730 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.088 105730 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.089 105730 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.089 105730 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.089 105730 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.089 105730 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.089 105730 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.089 105730 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.089 105730 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.089 105730 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.090 105730 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.090 105730 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.090 105730 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.090 105730 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.090 105730 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.090 105730 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.091 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.091 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.091 105730 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.091 105730 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.091 105730 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.091 105730 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.091 105730 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.092 105730 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.092 105730 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.092 105730 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.092 105730 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.092 105730 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.092 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.092 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.093 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.093 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.093 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.093 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.093 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.093 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.094 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.094 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.094 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.094 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.094 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.094 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 sudo[106117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smfogckwihljamdewigfjitmrokchnza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261777.858871-958-213102879989722/AnsiballZ_stat.py'
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.094 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.095 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.095 105730 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.095 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.095 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.095 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.095 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.095 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.096 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.096 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.096 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.096 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.096 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.096 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.096 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.096 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.097 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.097 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.097 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 sudo[106117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.097 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.097 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.097 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.097 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.098 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.098 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.098 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.098 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.098 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.098 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.098 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.098 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.099 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.099 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.099 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.099 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.099 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.099 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.099 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.100 105730 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.100 105730 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.100 105730 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.100 105730 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.100 105730 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.100 105730 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.100 105730 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.101 105730 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.101 105730 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.101 105730 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.101 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.101 105730 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.101 105730 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.101 105730 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.102 105730 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.102 105730 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.102 105730 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.102 105730 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.102 105730 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.103 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.103 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.103 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.103 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.103 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.103 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.104 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.104 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.104 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.104 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.104 105730 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.104 105730 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.105 105730 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.105 105730 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.105 105730 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.105 105730 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.105 105730 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.105 105730 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.105 105730 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.106 105730 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.106 105730 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.106 105730 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.106 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.106 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.106 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.106 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.107 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.107 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.107 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.107 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.107 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.107 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.107 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.108 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.108 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.108 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.108 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.108 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.108 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.108 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.109 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.109 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.109 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.109 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.109 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.109 105730 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.109 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.110 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.110 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.110 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.110 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.110 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.110 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.110 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.111 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.111 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.111 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.111 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.111 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.111 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.111 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.112 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.112 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.112 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.112 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.112 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.112 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.112 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.113 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.113 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.113 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.113 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.113 105730 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.113 105730 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.113 105730 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.114 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.114 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.114 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.114 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.114 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.114 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.114 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.115 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.115 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.115 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.115 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.115 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.115 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.115 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.116 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.116 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.116 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.116 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.116 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.116 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.117 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.117 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.117 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.117 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.117 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.117 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.117 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.118 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.118 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.118 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.118 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.118 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.118 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.118 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.119 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.119 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.119 105730 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.119 105730 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.130 105730 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.131 105730 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.131 105730 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.131 105730 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.132 105730 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.148 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 09f26141-c730-49d9-ad1c-7063ea4246fa (UUID: 09f26141-c730-49d9-ad1c-7063ea4246fa) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.175 105730 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.176 105730 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.176 105730 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.176 105730 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.179 105730 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.186 105730 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.192 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '09f26141-c730-49d9-ad1c-7063ea4246fa'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], external_ids={}, name=09f26141-c730-49d9-ad1c-7063ea4246fa, nb_cfg_timestamp=1771261730790, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.193 105730 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f415cd65a30>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.194 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.194 105730 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.194 105730 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.195 105730 INFO oslo_service.service [-] Starting 1 workers
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.198 105730 DEBUG oslo_service.service [-] Started child 106120 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.201 105730 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpc4aexurc/privsep.sock']
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.201 106120 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-179595'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.223 106120 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.223 106120 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.223 106120 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.227 106120 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.233 106120 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.239 106120 INFO eventlet.wsgi.server [-] (106120) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Feb 16 17:09:38 compute-0 python3.9[106119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:09:38 compute-0 sudo[106117]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:38 compute-0 sudo[106247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udachmtfubvyhfmmqthrvbmrouxqyhot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261777.858871-958-213102879989722/AnsiballZ_copy.py'
Feb 16 17:09:38 compute-0 sudo[106247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:38 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 16 17:09:38 compute-0 python3.9[106249]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771261777.858871-958-213102879989722/.source.yaml _original_basename=.u81prxbe follow=False checksum=f04108ec6687424cb255c58556677c82819c163f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:38 compute-0 sudo[106247]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.849 105730 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.850 105730 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpc4aexurc/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.699 106250 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.706 106250 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.710 106250 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.710 106250 INFO oslo.privsep.daemon [-] privsep daemon running as pid 106250
Feb 16 17:09:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:38.853 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[fb28804d-6104-48dc-b523-5c12e222692b]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:09:39 compute-0 sshd-session[97479]: Connection closed by 192.168.122.30 port 54520
Feb 16 17:09:39 compute-0 sshd-session[97476]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:09:39 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Feb 16 17:09:39 compute-0 systemd[1]: session-22.scope: Consumed 32.549s CPU time.
Feb 16 17:09:39 compute-0 systemd-logind[821]: Session 22 logged out. Waiting for processes to exit.
Feb 16 17:09:39 compute-0 systemd-logind[821]: Removed session 22.
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.370 106250 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.370 106250 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.370 106250 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.890 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[63af5b27-f8cf-4933-b40d-3b12e211691b]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.893 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, column=external_ids, values=({'neutron:ovn-metadata-id': 'e663d5c6-677b-575d-8fce-2c5bd5fb8894'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.904 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.911 105730 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.912 105730 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.912 105730 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.912 105730 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.912 105730 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.913 105730 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.913 105730 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.913 105730 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.914 105730 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.914 105730 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.915 105730 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.915 105730 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.915 105730 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.916 105730 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.916 105730 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.916 105730 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.917 105730 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.917 105730 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.917 105730 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.918 105730 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.918 105730 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.918 105730 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.919 105730 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.919 105730 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.919 105730 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.920 105730 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.920 105730 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.920 105730 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.921 105730 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.921 105730 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.921 105730 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.921 105730 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.922 105730 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.922 105730 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.922 105730 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.922 105730 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.923 105730 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.923 105730 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.923 105730 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.924 105730 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.924 105730 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.924 105730 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.925 105730 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.925 105730 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.925 105730 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.925 105730 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.926 105730 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.926 105730 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.926 105730 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.926 105730 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.927 105730 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.927 105730 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.927 105730 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.927 105730 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.928 105730 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.928 105730 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.928 105730 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.929 105730 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.929 105730 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.929 105730 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.930 105730 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.930 105730 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.930 105730 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.931 105730 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.931 105730 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.931 105730 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.932 105730 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.932 105730 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.932 105730 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.933 105730 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.933 105730 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.933 105730 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.934 105730 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.934 105730 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.934 105730 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.935 105730 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.935 105730 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.935 105730 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.936 105730 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.936 105730 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.936 105730 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.936 105730 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.937 105730 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.937 105730 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.937 105730 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.938 105730 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.938 105730 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.938 105730 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.939 105730 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.939 105730 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.939 105730 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.940 105730 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.940 105730 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.940 105730 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.940 105730 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.941 105730 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.941 105730 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.941 105730 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.942 105730 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.942 105730 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.942 105730 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.942 105730 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.942 105730 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.942 105730 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.942 105730 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.943 105730 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.943 105730 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.943 105730 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.943 105730 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.943 105730 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.943 105730 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.943 105730 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.944 105730 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.944 105730 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.944 105730 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.944 105730 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.944 105730 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.944 105730 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.944 105730 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.945 105730 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.945 105730 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.945 105730 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.945 105730 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.945 105730 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.945 105730 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.946 105730 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.946 105730 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.946 105730 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.946 105730 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.946 105730 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.946 105730 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.946 105730 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.947 105730 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.947 105730 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.947 105730 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.947 105730 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.947 105730 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.947 105730 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.947 105730 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.948 105730 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.948 105730 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.948 105730 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.948 105730 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.948 105730 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.948 105730 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.948 105730 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.949 105730 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.949 105730 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.949 105730 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.949 105730 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.949 105730 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.949 105730 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.949 105730 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.949 105730 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.950 105730 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.950 105730 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.950 105730 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.950 105730 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.950 105730 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.950 105730 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.950 105730 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.950 105730 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.951 105730 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.951 105730 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.951 105730 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.951 105730 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.951 105730 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.951 105730 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.951 105730 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.951 105730 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.952 105730 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.952 105730 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.952 105730 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.952 105730 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.952 105730 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.952 105730 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.952 105730 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.953 105730 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.953 105730 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.953 105730 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.953 105730 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.953 105730 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.953 105730 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.953 105730 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.954 105730 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.954 105730 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.954 105730 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.954 105730 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.954 105730 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.954 105730 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.954 105730 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.954 105730 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.955 105730 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.955 105730 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.955 105730 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.955 105730 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.955 105730 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.955 105730 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.955 105730 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.956 105730 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.956 105730 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.956 105730 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.956 105730 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.956 105730 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.956 105730 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.956 105730 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.956 105730 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.957 105730 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.957 105730 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.957 105730 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.957 105730 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.957 105730 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.957 105730 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.957 105730 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.958 105730 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.958 105730 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.958 105730 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.958 105730 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.958 105730 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.958 105730 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.958 105730 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.958 105730 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.959 105730 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.959 105730 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.959 105730 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.959 105730 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.959 105730 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.959 105730 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.959 105730 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.959 105730 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.960 105730 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.960 105730 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.960 105730 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.960 105730 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.960 105730 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.960 105730 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.960 105730 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.961 105730 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.961 105730 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.961 105730 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.961 105730 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.961 105730 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.961 105730 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.961 105730 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.961 105730 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.962 105730 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.962 105730 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.962 105730 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.962 105730 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.962 105730 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.962 105730 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.962 105730 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.963 105730 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.963 105730 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.963 105730 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.963 105730 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.963 105730 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.963 105730 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.963 105730 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.963 105730 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.964 105730 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.964 105730 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.964 105730 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.964 105730 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.964 105730 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.964 105730 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.964 105730 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.964 105730 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.965 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.965 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.965 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.965 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.965 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.965 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.965 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.966 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.966 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.966 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.966 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.966 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.966 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.966 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.967 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.967 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.967 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.967 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.967 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.967 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.967 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.967 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.968 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.968 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.968 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.968 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.968 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.968 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.968 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.969 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.969 105730 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.969 105730 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.969 105730 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.969 105730 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.969 105730 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:09:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:09:39.969 105730 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 17:09:43 compute-0 podman[106279]: 2026-02-16 17:09:43.111754139 +0000 UTC m=+0.084030514 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:09:44 compute-0 sshd-session[106306]: Accepted publickey for zuul from 192.168.122.30 port 53704 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:09:44 compute-0 systemd-logind[821]: New session 23 of user zuul.
Feb 16 17:09:44 compute-0 systemd[1]: Started Session 23 of User zuul.
Feb 16 17:09:44 compute-0 sshd-session[106306]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:09:45 compute-0 python3.9[106459]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:09:46 compute-0 sudo[106613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzbqdrgbboarsjrqzxaoecwfsfjyverb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261786.0513654-48-225097252150482/AnsiballZ_command.py'
Feb 16 17:09:46 compute-0 sudo[106613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:46 compute-0 python3.9[106615]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:09:46 compute-0 sudo[106613]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:47 compute-0 sudo[106778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvyvwwcrprnxkkgxidphhhxcdkefvbmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261787.0386312-70-161226023757105/AnsiballZ_systemd_service.py'
Feb 16 17:09:47 compute-0 sudo[106778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:47 compute-0 python3.9[106780]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 17:09:47 compute-0 systemd[1]: Reloading.
Feb 16 17:09:48 compute-0 systemd-sysv-generator[106811]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:09:48 compute-0 systemd-rc-local-generator[106803]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:09:48 compute-0 sudo[106778]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:48 compute-0 python3.9[106972]: ansible-ansible.builtin.service_facts Invoked
Feb 16 17:09:48 compute-0 network[106989]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 17:09:48 compute-0 network[106990]: 'network-scripts' will be removed from distribution in near future.
Feb 16 17:09:48 compute-0 network[106991]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 17:09:51 compute-0 sudo[107251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqyxudpqkljqbxljtvwyynmictlyxlzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261791.5188513-108-174842482287754/AnsiballZ_systemd_service.py'
Feb 16 17:09:51 compute-0 sudo[107251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:52 compute-0 python3.9[107253]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:09:52 compute-0 sudo[107251]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:52 compute-0 sudo[107404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfvweqaygmxcghorwkfirdvoahrymogw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261792.255869-108-18279448874495/AnsiballZ_systemd_service.py'
Feb 16 17:09:52 compute-0 sudo[107404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:52 compute-0 python3.9[107406]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:09:52 compute-0 sudo[107404]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:53 compute-0 sudo[107557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqcwgveyyszdmwejwhkwensiroqdfswn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261792.9814968-108-169123430766950/AnsiballZ_systemd_service.py'
Feb 16 17:09:53 compute-0 sudo[107557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:53 compute-0 python3.9[107559]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:09:53 compute-0 sudo[107557]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:53 compute-0 sudo[107710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrqsfvjxvnnkaguzkhxqlakimizsujbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261793.66396-108-160632338302524/AnsiballZ_systemd_service.py'
Feb 16 17:09:53 compute-0 sudo[107710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:54 compute-0 python3.9[107712]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:09:54 compute-0 sudo[107710]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:54 compute-0 sudo[107863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsfdmnyrdlscjfrumcelppyxtyehzdol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261794.319476-108-164192680287257/AnsiballZ_systemd_service.py'
Feb 16 17:09:54 compute-0 sudo[107863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:54 compute-0 python3.9[107865]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:09:54 compute-0 sudo[107863]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:55 compute-0 sudo[108016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrmqvequyjpqsqzzpwralzylnduvumzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261795.0316305-108-271535945113859/AnsiballZ_systemd_service.py'
Feb 16 17:09:55 compute-0 sudo[108016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:55 compute-0 python3.9[108018]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:09:55 compute-0 sudo[108016]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:56 compute-0 sudo[108169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yohvombwguosjqozdbayglzoajvpvqti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261795.7900949-108-95615180576464/AnsiballZ_systemd_service.py'
Feb 16 17:09:56 compute-0 sudo[108169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:56 compute-0 python3.9[108171]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:09:56 compute-0 sudo[108169]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:57 compute-0 sudo[108322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zszucdtzmcziqjdfxherqzurmcudpmhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261796.6878977-212-242738494938849/AnsiballZ_file.py'
Feb 16 17:09:57 compute-0 sudo[108322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:57 compute-0 python3.9[108324]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:57 compute-0 sudo[108322]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:57 compute-0 sudo[108474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwkzosqnlewczkihfcifuwuwgvcjhsuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261797.411038-212-12773803807001/AnsiballZ_file.py'
Feb 16 17:09:57 compute-0 sudo[108474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:57 compute-0 python3.9[108476]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:57 compute-0 sudo[108474]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:58 compute-0 sudo[108626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsdywyycsdihkncmupbtlojugoeotfmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261797.998154-212-255639060968562/AnsiballZ_file.py'
Feb 16 17:09:58 compute-0 sudo[108626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:58 compute-0 python3.9[108628]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:58 compute-0 sudo[108626]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:58 compute-0 sudo[108778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oobhhbpfaxbisofbkohowpjooaohrnyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261798.591022-212-13650107258204/AnsiballZ_file.py'
Feb 16 17:09:58 compute-0 sudo[108778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:58 compute-0 python3.9[108780]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:58 compute-0 sudo[108778]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:59 compute-0 sudo[108930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aydcqjuyjslkpvzmmqxqzcbtwjgtapbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261799.074905-212-187968872757818/AnsiballZ_file.py'
Feb 16 17:09:59 compute-0 sudo[108930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:09:59 compute-0 python3.9[108932]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:09:59 compute-0 sudo[108930]: pam_unix(sudo:session): session closed for user root
Feb 16 17:09:59 compute-0 sudo[109082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhzxkmthqduzzkgsmkyraghxlrppybtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261799.5981028-212-18854733235710/AnsiballZ_file.py'
Feb 16 17:09:59 compute-0 sudo[109082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:00 compute-0 python3.9[109084]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:10:00 compute-0 sudo[109082]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:00 compute-0 sudo[109234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrdinkdyjhwwixbtrltrrfgitjjcgooc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261800.1875062-212-112938831141218/AnsiballZ_file.py'
Feb 16 17:10:00 compute-0 sudo[109234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:00 compute-0 python3.9[109236]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:10:00 compute-0 sudo[109234]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:01 compute-0 sudo[109386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qilqozaabrvqzxryptmnqskabjepqqet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261800.8693104-312-54591744706390/AnsiballZ_file.py'
Feb 16 17:10:01 compute-0 sudo[109386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:01 compute-0 python3.9[109388]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:10:01 compute-0 sudo[109386]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:01 compute-0 sudo[109538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqwzkwquhqpcjyihdnuhdcjyhktsljqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261801.4473343-312-262646222299443/AnsiballZ_file.py'
Feb 16 17:10:01 compute-0 sudo[109538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:01 compute-0 python3.9[109540]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:10:01 compute-0 sudo[109538]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:02 compute-0 sudo[109690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzvqrqgskyrjmdyjihpqsztmjrnmpuja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261801.9615092-312-264844194262275/AnsiballZ_file.py'
Feb 16 17:10:02 compute-0 sudo[109690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:02 compute-0 python3.9[109692]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:10:02 compute-0 sudo[109690]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:02 compute-0 sudo[109842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrvvgaftyjazljrfwqeisowghiqvicds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261802.5331075-312-165465476240922/AnsiballZ_file.py'
Feb 16 17:10:02 compute-0 sudo[109842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:02 compute-0 python3.9[109844]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:10:02 compute-0 sudo[109842]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:03 compute-0 sudo[109994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojuvnjcrpuemlvbuitlhbdulozzibihg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261803.096563-312-70589546162417/AnsiballZ_file.py'
Feb 16 17:10:03 compute-0 sudo[109994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:03 compute-0 python3.9[109996]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:10:03 compute-0 sudo[109994]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:03 compute-0 sudo[110146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwhtxfanqmgfgqajvsujjpfvoenhfyge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261803.6511598-312-45050883501620/AnsiballZ_file.py'
Feb 16 17:10:03 compute-0 sudo[110146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:04 compute-0 python3.9[110148]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:10:04 compute-0 sudo[110146]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:04 compute-0 sudo[110298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssnelywbtihqlwwcgqfloltggblpcypb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261804.1454694-312-151191958374192/AnsiballZ_file.py'
Feb 16 17:10:04 compute-0 sudo[110298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:04 compute-0 python3.9[110300]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:10:04 compute-0 sudo[110298]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:05 compute-0 sudo[110450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tluqcqifvdursbngyyqvnmcnxjheozzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261804.889831-414-71266589766634/AnsiballZ_command.py'
Feb 16 17:10:05 compute-0 sudo[110450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:05 compute-0 python3.9[110452]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:10:05 compute-0 sudo[110450]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:06 compute-0 python3.9[110604]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 17:10:06 compute-0 sudo[110764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vshrvscjyltejszhjmqzdughfdwvaotn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261806.3233852-450-41172383472877/AnsiballZ_systemd_service.py'
Feb 16 17:10:06 compute-0 sudo[110764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:06 compute-0 podman[110728]: 2026-02-16 17:10:06.627686505 +0000 UTC m=+0.068245161 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 17:10:06 compute-0 python3.9[110769]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 17:10:06 compute-0 systemd[1]: Reloading.
Feb 16 17:10:07 compute-0 systemd-rc-local-generator[110799]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:10:07 compute-0 systemd-sysv-generator[110808]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:10:07 compute-0 sudo[110764]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:07 compute-0 sudo[110968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpheczzssqqrumafuwprxnhcdrriggyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261807.6083753-466-235940326434302/AnsiballZ_command.py'
Feb 16 17:10:07 compute-0 sudo[110968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:07 compute-0 python3.9[110970]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:10:07 compute-0 sudo[110968]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:08 compute-0 sudo[111121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgvwpdwjqyvjbgnqljaqizqscsozdrvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261808.0905306-466-43131379905696/AnsiballZ_command.py'
Feb 16 17:10:08 compute-0 sudo[111121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:08 compute-0 python3.9[111123]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:10:08 compute-0 sudo[111121]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:08 compute-0 sudo[111274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmsfabjhlbhcyxrhnfhqbzshuacjduhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261808.6148887-466-64803500947351/AnsiballZ_command.py'
Feb 16 17:10:08 compute-0 sudo[111274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:09 compute-0 python3.9[111276]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:10:09 compute-0 sudo[111274]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:09 compute-0 sudo[111427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfxrvqfoulnbndukacdmitwylddeiplb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261809.1663685-466-224721819439915/AnsiballZ_command.py'
Feb 16 17:10:09 compute-0 sudo[111427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:09 compute-0 python3.9[111429]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:10:09 compute-0 sudo[111427]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:09 compute-0 sudo[111580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbqllpotxlwajghjmaesscommrjncwda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261809.748555-466-141436685526790/AnsiballZ_command.py'
Feb 16 17:10:09 compute-0 sudo[111580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:10 compute-0 python3.9[111582]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:10:10 compute-0 sudo[111580]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:10 compute-0 sudo[111733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eihosvgiypmkbivbdrxahvqwfvoutfri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261810.3340073-466-141412637395939/AnsiballZ_command.py'
Feb 16 17:10:10 compute-0 sudo[111733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:10 compute-0 python3.9[111735]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:10:10 compute-0 sudo[111733]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:11 compute-0 sudo[111886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkcxfnohvposcgdpppifgpqodlcvafdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261810.8960106-466-199418272730225/AnsiballZ_command.py'
Feb 16 17:10:11 compute-0 sudo[111886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:11 compute-0 python3.9[111888]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:10:11 compute-0 sudo[111886]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:12 compute-0 sudo[112039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxqrcdpxfharsmbjhjuboyfetpjkbhpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261811.686735-574-225051091534810/AnsiballZ_getent.py'
Feb 16 17:10:12 compute-0 sudo[112039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:12 compute-0 python3.9[112041]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 16 17:10:12 compute-0 sudo[112039]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:12 compute-0 sudo[112192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruqlrfsynbojsccurruyqokslbxomotl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261812.4637723-590-76719137875900/AnsiballZ_group.py'
Feb 16 17:10:12 compute-0 sudo[112192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:13 compute-0 python3.9[112194]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 17:10:13 compute-0 groupadd[112195]: group added to /etc/group: name=libvirt, GID=42473
Feb 16 17:10:13 compute-0 groupadd[112195]: group added to /etc/gshadow: name=libvirt
Feb 16 17:10:13 compute-0 groupadd[112195]: new group: name=libvirt, GID=42473
Feb 16 17:10:13 compute-0 sudo[112192]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:13 compute-0 sudo[112361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjzlqghndocuuztvhninyybswqqyivhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261813.284114-606-16259656655860/AnsiballZ_user.py'
Feb 16 17:10:13 compute-0 sudo[112361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:13 compute-0 podman[112324]: 2026-02-16 17:10:13.829156045 +0000 UTC m=+0.122383756 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 17:10:14 compute-0 python3.9[112369]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 17:10:14 compute-0 useradd[112381]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 17:10:14 compute-0 sudo[112361]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:14 compute-0 sudo[112537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdyfjkxihsvfxomaentnezyqlmcnybru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261814.478524-628-8945602680742/AnsiballZ_setup.py'
Feb 16 17:10:14 compute-0 sudo[112537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:14 compute-0 python3.9[112539]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:10:15 compute-0 sudo[112537]: pam_unix(sudo:session): session closed for user root
Feb 16 17:10:15 compute-0 sudo[112621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uovatmuythziqyaxqaopiwdlwcegnecu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261814.478524-628-8945602680742/AnsiballZ_dnf.py'
Feb 16 17:10:15 compute-0 sudo[112621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:10:15 compute-0 python3.9[112623]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:10:37 compute-0 podman[112673]: 2026-02-16 17:10:37.131457579 +0000 UTC m=+0.098981262 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 16 17:10:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:10:38.134 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:10:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:10:38.135 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:10:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:10:38.135 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:10:44 compute-0 podman[112847]: 2026-02-16 17:10:44.105072842 +0000 UTC m=+0.077256133 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:10:58 compute-0 kernel: SELinux:  Converting 2766 SID table entries...
Feb 16 17:10:58 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 17:10:58 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 17:10:58 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 17:10:58 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 17:10:58 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 17:10:58 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 17:10:58 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 17:11:08 compute-0 dbus-broker-launch[807]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Feb 16 17:11:08 compute-0 podman[112905]: 2026-02-16 17:11:08.252908332 +0000 UTC m=+0.098206479 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 16 17:11:10 compute-0 kernel: SELinux:  Converting 2766 SID table entries...
Feb 16 17:11:10 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 17:11:10 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 17:11:10 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 17:11:10 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 17:11:10 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 17:11:10 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 17:11:10 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 17:11:15 compute-0 dbus-broker-launch[807]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 16 17:11:15 compute-0 podman[112931]: 2026-02-16 17:11:15.141702072 +0000 UTC m=+0.096001681 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 16 17:11:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:11:38.135 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:11:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:11:38.136 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:11:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:11:38.136 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:11:39 compute-0 podman[126690]: 2026-02-16 17:11:39.073814127 +0000 UTC m=+0.048849584 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 17:11:46 compute-0 podman[129872]: 2026-02-16 17:11:46.100147601 +0000 UTC m=+0.072419078 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 16 17:11:56 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Feb 16 17:11:56 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 16 17:11:56 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 16 17:11:56 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 16 17:11:56 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 16 17:11:56 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 16 17:11:56 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 16 17:11:56 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 16 17:11:57 compute-0 groupadd[129912]: group added to /etc/group: name=dnsmasq, GID=993
Feb 16 17:11:57 compute-0 groupadd[129912]: group added to /etc/gshadow: name=dnsmasq
Feb 16 17:11:57 compute-0 groupadd[129912]: new group: name=dnsmasq, GID=993
Feb 16 17:11:57 compute-0 useradd[129919]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 16 17:11:57 compute-0 dbus-broker-launch[797]: Noticed file-system modification, trigger reload.
Feb 16 17:11:57 compute-0 dbus-broker-launch[807]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 16 17:11:57 compute-0 dbus-broker-launch[797]: Noticed file-system modification, trigger reload.
Feb 16 17:11:58 compute-0 groupadd[129932]: group added to /etc/group: name=clevis, GID=992
Feb 16 17:11:58 compute-0 groupadd[129932]: group added to /etc/gshadow: name=clevis
Feb 16 17:11:58 compute-0 groupadd[129932]: new group: name=clevis, GID=992
Feb 16 17:11:58 compute-0 useradd[129939]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 16 17:11:58 compute-0 usermod[129949]: add 'clevis' to group 'tss'
Feb 16 17:11:58 compute-0 usermod[129949]: add 'clevis' to shadow group 'tss'
Feb 16 17:12:00 compute-0 polkitd[44497]: Reloading rules
Feb 16 17:12:00 compute-0 polkitd[44497]: Collecting garbage unconditionally...
Feb 16 17:12:00 compute-0 polkitd[44497]: Loading rules from directory /etc/polkit-1/rules.d
Feb 16 17:12:00 compute-0 polkitd[44497]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 16 17:12:00 compute-0 polkitd[44497]: Finished loading, compiling and executing 3 rules
Feb 16 17:12:00 compute-0 polkitd[44497]: Reloading rules
Feb 16 17:12:00 compute-0 polkitd[44497]: Collecting garbage unconditionally...
Feb 16 17:12:00 compute-0 polkitd[44497]: Loading rules from directory /etc/polkit-1/rules.d
Feb 16 17:12:00 compute-0 polkitd[44497]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 16 17:12:00 compute-0 polkitd[44497]: Finished loading, compiling and executing 3 rules
Feb 16 17:12:01 compute-0 groupadd[130139]: group added to /etc/group: name=ceph, GID=167
Feb 16 17:12:01 compute-0 groupadd[130139]: group added to /etc/gshadow: name=ceph
Feb 16 17:12:01 compute-0 groupadd[130139]: new group: name=ceph, GID=167
Feb 16 17:12:01 compute-0 useradd[130145]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 16 17:12:04 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Feb 16 17:12:04 compute-0 sshd[1021]: Received signal 15; terminating.
Feb 16 17:12:04 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Feb 16 17:12:04 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Feb 16 17:12:04 compute-0 systemd[1]: sshd.service: Consumed 1.395s CPU time, read 564.0K from disk, written 0B to disk.
Feb 16 17:12:04 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Feb 16 17:12:04 compute-0 systemd[1]: Stopping sshd-keygen.target...
Feb 16 17:12:04 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 17:12:04 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 17:12:04 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 16 17:12:04 compute-0 systemd[1]: Reached target sshd-keygen.target.
Feb 16 17:12:04 compute-0 systemd[1]: Starting OpenSSH server daemon...
Feb 16 17:12:04 compute-0 sshd[130664]: Server listening on 0.0.0.0 port 22.
Feb 16 17:12:04 compute-0 sshd[130664]: Server listening on :: port 22.
Feb 16 17:12:04 compute-0 systemd[1]: Started OpenSSH server daemon.
Feb 16 17:12:05 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 17:12:05 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 17:12:05 compute-0 systemd[1]: Reloading.
Feb 16 17:12:05 compute-0 systemd-rc-local-generator[130920]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:12:05 compute-0 systemd-sysv-generator[130925]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:12:06 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 17:12:08 compute-0 sudo[112621]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:09 compute-0 sudo[134723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hisqnhlmtjuvrhdldqsyadgafiitfnvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261928.8415816-652-268987458761408/AnsiballZ_systemd.py'
Feb 16 17:12:09 compute-0 sudo[134723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:09 compute-0 podman[134622]: 2026-02-16 17:12:09.411305451 +0000 UTC m=+0.058716169 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:12:09 compute-0 python3.9[134757]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 17:12:09 compute-0 systemd[1]: Reloading.
Feb 16 17:12:09 compute-0 systemd-sysv-generator[135289]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:12:09 compute-0 systemd-rc-local-generator[135281]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:12:10 compute-0 sudo[134723]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:10 compute-0 sudo[136099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrjrjjimgzihtulbcrkjljzoausbvuvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261930.1027086-652-51853909958150/AnsiballZ_systemd.py'
Feb 16 17:12:10 compute-0 sudo[136099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:10 compute-0 python3.9[136123]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 17:12:11 compute-0 systemd[1]: Reloading.
Feb 16 17:12:11 compute-0 systemd-rc-local-generator[137678]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:12:11 compute-0 systemd-sysv-generator[137682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:12:12 compute-0 sudo[136099]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:12 compute-0 sudo[138432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znjdehffdgnepqyqvqscigehoibgmayo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261932.1429121-652-121676213301399/AnsiballZ_systemd.py'
Feb 16 17:12:12 compute-0 sudo[138432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:12 compute-0 python3.9[138464]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 17:12:12 compute-0 systemd[1]: Reloading.
Feb 16 17:12:12 compute-0 systemd-sysv-generator[138989]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:12:12 compute-0 systemd-rc-local-generator[138986]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:12:13 compute-0 sudo[138432]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:13 compute-0 sudo[139665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqwhgcbgyeeairehxkcwjydgpowkpgnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261933.1377413-652-122488282518000/AnsiballZ_systemd.py'
Feb 16 17:12:13 compute-0 sudo[139665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:13 compute-0 python3.9[139684]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 17:12:13 compute-0 systemd[1]: Reloading.
Feb 16 17:12:13 compute-0 systemd-sysv-generator[140106]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:12:13 compute-0 systemd-rc-local-generator[140101]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:12:13 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 17:12:13 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 17:12:13 compute-0 systemd[1]: man-db-cache-update.service: Consumed 9.403s CPU time.
Feb 16 17:12:13 compute-0 systemd[1]: run-r1b2a46630ab341a7a00f406e15d6326e.service: Deactivated successfully.
Feb 16 17:12:13 compute-0 sudo[139665]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:14 compute-0 sudo[140264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jesfvsoprjcduomspjwbjztnqjgovsck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261934.13211-710-94397216297175/AnsiballZ_systemd.py'
Feb 16 17:12:14 compute-0 sudo[140264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:14 compute-0 python3.9[140266]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:14 compute-0 systemd[1]: Reloading.
Feb 16 17:12:14 compute-0 systemd-sysv-generator[140295]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:12:14 compute-0 systemd-rc-local-generator[140289]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:12:15 compute-0 sudo[140264]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:15 compute-0 sudo[140461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmztbgaodprpaiyvsscshrzkceusvbdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261935.1529145-710-131309042777644/AnsiballZ_systemd.py'
Feb 16 17:12:15 compute-0 sudo[140461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:15 compute-0 python3.9[140463]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:15 compute-0 systemd[1]: Reloading.
Feb 16 17:12:15 compute-0 systemd-rc-local-generator[140493]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:12:15 compute-0 systemd-sysv-generator[140498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:12:16 compute-0 sudo[140461]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:16 compute-0 sudo[140667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnlrbmtbuyepbmkswkwttfedifgcwyxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261936.195384-710-46304868798432/AnsiballZ_systemd.py'
Feb 16 17:12:16 compute-0 sudo[140667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:16 compute-0 podman[140632]: 2026-02-16 17:12:16.522237649 +0000 UTC m=+0.093000139 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 17:12:16 compute-0 python3.9[140677]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:16 compute-0 systemd[1]: Reloading.
Feb 16 17:12:16 compute-0 systemd-rc-local-generator[140711]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:12:16 compute-0 systemd-sysv-generator[140717]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:12:17 compute-0 sudo[140667]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:17 compute-0 sudo[140881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugqovdnebsqwewtmqrlljzakamziiaea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261937.2364304-710-141836087503608/AnsiballZ_systemd.py'
Feb 16 17:12:17 compute-0 sudo[140881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:17 compute-0 python3.9[140883]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:17 compute-0 sudo[140881]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:18 compute-0 sudo[141036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgytodrmqmzxlutypmhkbdugcpnfgkyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261938.0521824-710-13646749533697/AnsiballZ_systemd.py'
Feb 16 17:12:18 compute-0 sudo[141036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:18 compute-0 python3.9[141038]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:18 compute-0 systemd[1]: Reloading.
Feb 16 17:12:18 compute-0 systemd-rc-local-generator[141065]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:12:18 compute-0 systemd-sysv-generator[141069]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:12:18 compute-0 sudo[141036]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:19 compute-0 sudo[141233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqffaxnhpxljfzygfweisnkcdjvtoolx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261939.1453319-782-262413131732454/AnsiballZ_systemd.py'
Feb 16 17:12:19 compute-0 sudo[141233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:19 compute-0 python3.9[141235]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 16 17:12:19 compute-0 systemd[1]: Reloading.
Feb 16 17:12:19 compute-0 systemd-sysv-generator[141265]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:12:19 compute-0 systemd-rc-local-generator[141255]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:12:19 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Feb 16 17:12:20 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Feb 16 17:12:20 compute-0 sudo[141233]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:20 compute-0 sudo[141433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwizqihozuqysldzkgreznhjybkdtknt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261940.189988-798-215790750226112/AnsiballZ_systemd.py'
Feb 16 17:12:20 compute-0 sudo[141433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:20 compute-0 python3.9[141435]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:20 compute-0 sudo[141433]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:21 compute-0 sudo[141588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzgyrzktpcgltwbjagmyusntlrocazcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261940.9496946-798-241944421365054/AnsiballZ_systemd.py'
Feb 16 17:12:21 compute-0 sudo[141588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:21 compute-0 python3.9[141590]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:21 compute-0 sudo[141588]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:21 compute-0 sudo[141743]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izkrfohmfzzzuwnkouhjccbqcwasgacx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261941.6922705-798-183294497223999/AnsiballZ_systemd.py'
Feb 16 17:12:21 compute-0 sudo[141743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:22 compute-0 python3.9[141745]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:22 compute-0 sudo[141743]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:22 compute-0 sudo[141898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzsfabeasnkatnkvocxieknilkswdxwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261942.484521-798-240111532579415/AnsiballZ_systemd.py'
Feb 16 17:12:22 compute-0 sudo[141898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:23 compute-0 python3.9[141900]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:23 compute-0 sudo[141898]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:23 compute-0 sudo[142053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hebnldmyslzsalowtwkwahuzlorbywme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261943.1981723-798-22182912246169/AnsiballZ_systemd.py'
Feb 16 17:12:23 compute-0 sudo[142053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:23 compute-0 python3.9[142055]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:23 compute-0 sudo[142053]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:24 compute-0 sudo[142208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uulybgzbrwuqlqprmqbqqbwucdtrjhfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261943.9991324-798-207269604370738/AnsiballZ_systemd.py'
Feb 16 17:12:24 compute-0 sudo[142208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:24 compute-0 python3.9[142210]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:24 compute-0 sudo[142208]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:25 compute-0 sudo[142363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieygjxbzlwacqfwenzhemodeglcutkkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261944.75298-798-38739722706470/AnsiballZ_systemd.py'
Feb 16 17:12:25 compute-0 sudo[142363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:25 compute-0 python3.9[142365]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:25 compute-0 sudo[142363]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:25 compute-0 sudo[142518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiddqoykyejbmqmqurezjugoutedjadr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261945.4793346-798-235389642748700/AnsiballZ_systemd.py'
Feb 16 17:12:25 compute-0 sudo[142518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:25 compute-0 python3.9[142520]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:26 compute-0 sudo[142518]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:26 compute-0 sudo[142673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njgrgksrgjalriazrgxbsqkqanxvorxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261946.1773348-798-192253593964342/AnsiballZ_systemd.py'
Feb 16 17:12:26 compute-0 sudo[142673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:26 compute-0 python3.9[142675]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:26 compute-0 sudo[142673]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:27 compute-0 sudo[142828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jeoreujquavxwvnncwcuecbuvxwfukra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261946.9414256-798-106579507623256/AnsiballZ_systemd.py'
Feb 16 17:12:27 compute-0 sudo[142828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:27 compute-0 python3.9[142830]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:27 compute-0 sudo[142828]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:27 compute-0 sudo[142983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhbxeoqvftjprnkgvhjdeodayvxwylyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261947.6794355-798-105996954799644/AnsiballZ_systemd.py'
Feb 16 17:12:27 compute-0 sudo[142983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:28 compute-0 python3.9[142985]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:28 compute-0 sudo[142983]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:28 compute-0 sudo[143138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvorddkzzrhxuqlvrxbixklhjfoamhcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261948.3964565-798-6204934122124/AnsiballZ_systemd.py'
Feb 16 17:12:28 compute-0 sudo[143138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:28 compute-0 python3.9[143140]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:29 compute-0 sudo[143138]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:29 compute-0 sudo[143293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifnttkjdmpuannaauxvcldytskkemvrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261949.166337-798-268872675687434/AnsiballZ_systemd.py'
Feb 16 17:12:29 compute-0 sudo[143293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:29 compute-0 python3.9[143295]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:29 compute-0 sudo[143293]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:30 compute-0 sudo[143448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpgjqdejeqjgjwnzojzibzimrjssewaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261949.9074836-798-96617538294876/AnsiballZ_systemd.py'
Feb 16 17:12:30 compute-0 sudo[143448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:30 compute-0 python3.9[143450]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 16 17:12:30 compute-0 sudo[143448]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:31 compute-0 sudo[143603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvwfvaubbhwzdjqsyrrfkujxqgeqgata ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261950.9396663-1002-236416341879552/AnsiballZ_file.py'
Feb 16 17:12:31 compute-0 sudo[143603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:31 compute-0 python3.9[143605]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:12:31 compute-0 sudo[143603]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:31 compute-0 sudo[143755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-figicnhmvrfquzsxruynmqlmkyerbtef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261951.500638-1002-185285446665874/AnsiballZ_file.py'
Feb 16 17:12:31 compute-0 sudo[143755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:31 compute-0 python3.9[143757]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:12:32 compute-0 sudo[143755]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:32 compute-0 sudo[143907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjavihtiyxywufjeatbzpcnpqjuljejg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261952.1339102-1002-271309891342888/AnsiballZ_file.py'
Feb 16 17:12:32 compute-0 sudo[143907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:32 compute-0 python3.9[143909]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:12:32 compute-0 sudo[143907]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:32 compute-0 sudo[144059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arbyysxzkjjyhekxtbcjdwznedwpepnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261952.7114687-1002-62854950517628/AnsiballZ_file.py'
Feb 16 17:12:32 compute-0 sudo[144059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:33 compute-0 python3.9[144061]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:12:33 compute-0 sudo[144059]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:33 compute-0 sudo[144211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tapstpksmfwuqaipcewppxyklwvnvrib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261953.3282592-1002-248836518658546/AnsiballZ_file.py'
Feb 16 17:12:33 compute-0 sudo[144211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:33 compute-0 python3.9[144213]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:12:33 compute-0 sudo[144211]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:34 compute-0 sudo[144363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgcsagqtuzipyuiiuodgakmjykdljjnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261953.9178905-1002-146064439253712/AnsiballZ_file.py'
Feb 16 17:12:34 compute-0 sudo[144363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:34 compute-0 python3.9[144365]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:12:34 compute-0 sudo[144363]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:35 compute-0 python3.9[144515]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:12:36 compute-0 sudo[144665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnfezljscavdagryoeffpeghykpowymr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261955.607275-1104-102898521235878/AnsiballZ_stat.py'
Feb 16 17:12:36 compute-0 sudo[144665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:36 compute-0 python3.9[144667]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:36 compute-0 sudo[144665]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:36 compute-0 sudo[144790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dixwguyzzomtjoyypxyhxfygjzxrbbqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261955.607275-1104-102898521235878/AnsiballZ_copy.py'
Feb 16 17:12:36 compute-0 sudo[144790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:36 compute-0 python3.9[144792]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771261955.607275-1104-102898521235878/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:36 compute-0 sudo[144790]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:37 compute-0 sudo[144942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivmucxjugnjpgxxqceiztduttsckkaxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261956.965386-1104-280289450760991/AnsiballZ_stat.py'
Feb 16 17:12:37 compute-0 sudo[144942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:37 compute-0 python3.9[144944]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:37 compute-0 sudo[144942]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:37 compute-0 sudo[145067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tndhkowybncwakhifkmizkyindkbylin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261956.965386-1104-280289450760991/AnsiballZ_copy.py'
Feb 16 17:12:37 compute-0 sudo[145067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:37 compute-0 python3.9[145069]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771261956.965386-1104-280289450760991/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:37 compute-0 sudo[145067]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:12:38.137 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:12:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:12:38.138 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:12:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:12:38.138 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:12:38 compute-0 sudo[145219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slgdcqgvdcsyaqowcbgkbidgnnwruubh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261958.0480738-1104-220539618450642/AnsiballZ_stat.py'
Feb 16 17:12:38 compute-0 sudo[145219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:38 compute-0 python3.9[145221]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:38 compute-0 sudo[145219]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:38 compute-0 sudo[145344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkopynmiyolxdhmxgzcfwwqfzsnmqxfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261958.0480738-1104-220539618450642/AnsiballZ_copy.py'
Feb 16 17:12:38 compute-0 sudo[145344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:39 compute-0 python3.9[145346]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771261958.0480738-1104-220539618450642/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:39 compute-0 sudo[145344]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:39 compute-0 sudo[145496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqpwenpkjzwmwzdbhlapvjwkvbldwnyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261959.197822-1104-58884223222678/AnsiballZ_stat.py'
Feb 16 17:12:39 compute-0 sudo[145496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:39 compute-0 podman[145498]: 2026-02-16 17:12:39.546460679 +0000 UTC m=+0.064275921 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Feb 16 17:12:39 compute-0 python3.9[145499]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:39 compute-0 sudo[145496]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:40 compute-0 sudo[145641]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htqtbmfiubatttbhlvtnebnrqxmkfxoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261959.197822-1104-58884223222678/AnsiballZ_copy.py'
Feb 16 17:12:40 compute-0 sudo[145641]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:40 compute-0 python3.9[145643]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771261959.197822-1104-58884223222678/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:40 compute-0 sudo[145641]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:40 compute-0 sudo[145793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqgybntbnwnxwxhmodrxabtzgguwncsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261960.4624524-1104-76082631788799/AnsiballZ_stat.py'
Feb 16 17:12:40 compute-0 sudo[145793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:40 compute-0 python3.9[145795]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:40 compute-0 sudo[145793]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:41 compute-0 sudo[145918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sngplnwbajktatgfaqljnxvwmxxmyoix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261960.4624524-1104-76082631788799/AnsiballZ_copy.py'
Feb 16 17:12:41 compute-0 sudo[145918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:41 compute-0 python3.9[145920]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771261960.4624524-1104-76082631788799/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:41 compute-0 sudo[145918]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:41 compute-0 sudo[146070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekwqsviyfzcelejirisadjsyyfwxacll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261961.5801108-1104-227774821564094/AnsiballZ_stat.py'
Feb 16 17:12:41 compute-0 sudo[146070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:41 compute-0 python3.9[146072]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:42 compute-0 sudo[146070]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:42 compute-0 sudo[146195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smcarmwgnyjnsdyeouhhdaejwkedvtof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261961.5801108-1104-227774821564094/AnsiballZ_copy.py'
Feb 16 17:12:42 compute-0 sudo[146195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:42 compute-0 python3.9[146197]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771261961.5801108-1104-227774821564094/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:42 compute-0 sudo[146195]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:42 compute-0 sudo[146347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xogmvleqfzycldxfiwdnwqllncsumoet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261962.659415-1104-151538556619387/AnsiballZ_stat.py'
Feb 16 17:12:42 compute-0 sudo[146347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:43 compute-0 python3.9[146349]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:43 compute-0 sudo[146347]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:43 compute-0 sudo[146470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npbvgfemririksvmnjlpfndsjznloepq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261962.659415-1104-151538556619387/AnsiballZ_copy.py'
Feb 16 17:12:43 compute-0 sudo[146470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:43 compute-0 python3.9[146472]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771261962.659415-1104-151538556619387/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:43 compute-0 sudo[146470]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:44 compute-0 sudo[146622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-equmenwoytvonqrxolfqajxzbwevardn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261963.79151-1104-183804585374145/AnsiballZ_stat.py'
Feb 16 17:12:44 compute-0 sudo[146622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:44 compute-0 python3.9[146624]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:44 compute-0 sudo[146622]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:44 compute-0 sudo[146747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmqizxwzetgxmxgwnoonggteetbtaulr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261963.79151-1104-183804585374145/AnsiballZ_copy.py'
Feb 16 17:12:44 compute-0 sudo[146747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:44 compute-0 python3.9[146749]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771261963.79151-1104-183804585374145/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:45 compute-0 sudo[146747]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:45 compute-0 sudo[146899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzixkfpwsfiednziefboibpmzilbcjee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261965.1650593-1330-26967010327746/AnsiballZ_command.py'
Feb 16 17:12:45 compute-0 sudo[146899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:45 compute-0 python3.9[146901]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Feb 16 17:12:45 compute-0 sudo[146899]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:46 compute-0 sudo[147052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naacuhmlkpmfkujwkixluglnchaxkhlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261965.8251026-1348-9202878286964/AnsiballZ_file.py'
Feb 16 17:12:46 compute-0 sudo[147052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:46 compute-0 python3.9[147054]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:46 compute-0 sudo[147052]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:46 compute-0 sudo[147223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vejfkdwcjymsklqoaucewrzcfzhoyerd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261966.3951676-1348-223354978374827/AnsiballZ_file.py'
Feb 16 17:12:46 compute-0 sudo[147223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:46 compute-0 podman[147178]: 2026-02-16 17:12:46.677280874 +0000 UTC m=+0.079759212 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Feb 16 17:12:46 compute-0 python3.9[147227]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:46 compute-0 sudo[147223]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:47 compute-0 sudo[147382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypjdqyargiwiwyppclpcpjbdbagjypyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261966.9863517-1348-51293185757232/AnsiballZ_file.py'
Feb 16 17:12:47 compute-0 sudo[147382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:47 compute-0 python3.9[147384]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:47 compute-0 sudo[147382]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:47 compute-0 sudo[147534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuztohbmihasudyelnkdcwdtfvlydupj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261967.5676534-1348-114226091788425/AnsiballZ_file.py'
Feb 16 17:12:47 compute-0 sudo[147534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:48 compute-0 python3.9[147536]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:48 compute-0 sudo[147534]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:48 compute-0 sudo[147686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auiyxibkwiibsnbpycrbipwxhvctcqop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261968.215245-1348-182334246534687/AnsiballZ_file.py'
Feb 16 17:12:48 compute-0 sudo[147686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:48 compute-0 python3.9[147688]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:48 compute-0 sudo[147686]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:49 compute-0 sudo[147838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rciqdqndecpkkswefmbglzktawrdchaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261968.7822342-1348-22939771631717/AnsiballZ_file.py'
Feb 16 17:12:49 compute-0 sudo[147838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:49 compute-0 python3.9[147840]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:49 compute-0 sudo[147838]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:49 compute-0 sudo[147990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbcajparwqayvkfsgmrqtgvsxulnlpcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261969.3564427-1348-94857124318920/AnsiballZ_file.py'
Feb 16 17:12:49 compute-0 sudo[147990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:49 compute-0 python3.9[147992]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:49 compute-0 sudo[147990]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:50 compute-0 sudo[148142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjkgpkpwouhnjweeexoolbqcjtspdubq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261969.9457338-1348-125696544326193/AnsiballZ_file.py'
Feb 16 17:12:50 compute-0 sudo[148142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:50 compute-0 python3.9[148144]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:50 compute-0 sudo[148142]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:50 compute-0 sudo[148294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbljntwcztifspoizjijsjfwvojcuylq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261970.5598962-1348-42533879013833/AnsiballZ_file.py'
Feb 16 17:12:50 compute-0 sudo[148294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:51 compute-0 python3.9[148296]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:51 compute-0 sudo[148294]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:51 compute-0 sudo[148446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itgffzubupdpcfinylkzhstfvyidppac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261971.1659594-1348-259457324133591/AnsiballZ_file.py'
Feb 16 17:12:51 compute-0 sudo[148446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:51 compute-0 python3.9[148448]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:51 compute-0 sudo[148446]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:52 compute-0 sudo[148598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgcumtwlfkbegwxnnvxhmsbabjylawpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261971.772234-1348-6282682922124/AnsiballZ_file.py'
Feb 16 17:12:52 compute-0 sudo[148598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:52 compute-0 python3.9[148600]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:52 compute-0 sudo[148598]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:52 compute-0 sudo[148750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckjllszrzedvxzmtjohmhyesdjzoaozr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261972.6014392-1348-226316755548011/AnsiballZ_file.py'
Feb 16 17:12:52 compute-0 sudo[148750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:53 compute-0 python3.9[148752]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:53 compute-0 sudo[148750]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:53 compute-0 sudo[148902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxhyjouifhpgcelmvlytbnuscbovfyde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261973.1278753-1348-77379642753203/AnsiballZ_file.py'
Feb 16 17:12:53 compute-0 sudo[148902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:53 compute-0 python3.9[148904]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:53 compute-0 sudo[148902]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:54 compute-0 sudo[149054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijdtgyzfjcgeqqovzwfwnzhtjdgxbzqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261973.7451594-1348-258697149345188/AnsiballZ_file.py'
Feb 16 17:12:54 compute-0 sudo[149054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:54 compute-0 python3.9[149056]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:54 compute-0 sudo[149054]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:54 compute-0 sudo[149206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkpefjfddkotyaokikchpnicgnzkqfig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261974.3957944-1546-213142356156287/AnsiballZ_stat.py'
Feb 16 17:12:54 compute-0 sudo[149206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:54 compute-0 python3.9[149208]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:54 compute-0 sudo[149206]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:55 compute-0 sudo[149329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfttnzbmzwnvoqrneaevyifwyeftlgly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261974.3957944-1546-213142356156287/AnsiballZ_copy.py'
Feb 16 17:12:55 compute-0 sudo[149329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:55 compute-0 python3.9[149331]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261974.3957944-1546-213142356156287/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:55 compute-0 sudo[149329]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:55 compute-0 sudo[149481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywtfxlfuodmfbzqvogquehixsclyuarx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261975.5125282-1546-125048443021976/AnsiballZ_stat.py'
Feb 16 17:12:55 compute-0 sudo[149481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:55 compute-0 python3.9[149483]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:55 compute-0 sudo[149481]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:56 compute-0 sudo[149604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmurthzovdiurxjcaveytqmafyihqxwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261975.5125282-1546-125048443021976/AnsiballZ_copy.py'
Feb 16 17:12:56 compute-0 sudo[149604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:56 compute-0 python3.9[149606]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261975.5125282-1546-125048443021976/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:56 compute-0 sudo[149604]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:56 compute-0 sudo[149756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxanmpritaetzuahaxntlqbshjhebrjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261976.5832276-1546-274202211705508/AnsiballZ_stat.py'
Feb 16 17:12:56 compute-0 sudo[149756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:56 compute-0 python3.9[149758]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:57 compute-0 sudo[149756]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:57 compute-0 sudo[149879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owkpxkolnynczwhqheddrjmvnwposaum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261976.5832276-1546-274202211705508/AnsiballZ_copy.py'
Feb 16 17:12:57 compute-0 sudo[149879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:57 compute-0 python3.9[149881]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261976.5832276-1546-274202211705508/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:57 compute-0 sudo[149879]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:57 compute-0 sudo[150031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jejhjzsybjshnrnfjqalzmiqjyalelan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261977.6542683-1546-179265174527164/AnsiballZ_stat.py'
Feb 16 17:12:57 compute-0 sudo[150031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:58 compute-0 python3.9[150033]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:58 compute-0 sudo[150031]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:58 compute-0 sudo[150154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jftpbjvawtsrqecarppgffuvbrbhpyzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261977.6542683-1546-179265174527164/AnsiballZ_copy.py'
Feb 16 17:12:58 compute-0 sudo[150154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:58 compute-0 python3.9[150156]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261977.6542683-1546-179265174527164/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:58 compute-0 sudo[150154]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:59 compute-0 sudo[150306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkhrqwhvkayddjxkbkabbuisxqiyijcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261978.8143084-1546-184591375792366/AnsiballZ_stat.py'
Feb 16 17:12:59 compute-0 sudo[150306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:59 compute-0 python3.9[150308]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:12:59 compute-0 sudo[150306]: pam_unix(sudo:session): session closed for user root
Feb 16 17:12:59 compute-0 sudo[150429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kveqyrjkiibajpyfzdmrgyvoevcbieet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261978.8143084-1546-184591375792366/AnsiballZ_copy.py'
Feb 16 17:12:59 compute-0 sudo[150429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:12:59 compute-0 python3.9[150431]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261978.8143084-1546-184591375792366/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:12:59 compute-0 sudo[150429]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:00 compute-0 sudo[150581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwuyczyeujoksfsjjqtywefgdgilczls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261979.8927581-1546-191478405440253/AnsiballZ_stat.py'
Feb 16 17:13:00 compute-0 sudo[150581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:00 compute-0 python3.9[150583]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:00 compute-0 sudo[150581]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:00 compute-0 sudo[150704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhyyippfsdruaktfizvmevdijqoysapr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261979.8927581-1546-191478405440253/AnsiballZ_copy.py'
Feb 16 17:13:00 compute-0 sudo[150704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:00 compute-0 python3.9[150706]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261979.8927581-1546-191478405440253/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:00 compute-0 sudo[150704]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:01 compute-0 sudo[150856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtftmnfqwqzycslggvqrijwrohfopjbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261980.9054387-1546-126018137379625/AnsiballZ_stat.py'
Feb 16 17:13:01 compute-0 sudo[150856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:01 compute-0 python3.9[150858]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:01 compute-0 sudo[150856]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:01 compute-0 sudo[150979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phmrltobtnzxrdjdzfxvsberuenxrckk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261980.9054387-1546-126018137379625/AnsiballZ_copy.py'
Feb 16 17:13:01 compute-0 sudo[150979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:01 compute-0 python3.9[150981]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261980.9054387-1546-126018137379625/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:01 compute-0 sudo[150979]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:02 compute-0 sudo[151131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlidrudkusrgsxtormhujxunawdgdeeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261981.968934-1546-93397728266478/AnsiballZ_stat.py'
Feb 16 17:13:02 compute-0 sudo[151131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:02 compute-0 python3.9[151133]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:02 compute-0 sudo[151131]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:02 compute-0 sudo[151254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vniictskpyhehjhvtbhljwacdfcnbiwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261981.968934-1546-93397728266478/AnsiballZ_copy.py'
Feb 16 17:13:02 compute-0 sudo[151254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:02 compute-0 python3.9[151256]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261981.968934-1546-93397728266478/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:02 compute-0 sudo[151254]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:03 compute-0 sudo[151406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggkpqlataskkuqmbfnraqlyndvwyatyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261983.1054401-1546-208787496316622/AnsiballZ_stat.py'
Feb 16 17:13:03 compute-0 sudo[151406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:03 compute-0 python3.9[151408]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:03 compute-0 sudo[151406]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:03 compute-0 sudo[151529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-illgunztfllxyyrfwjtdvkqrdqmhdppo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261983.1054401-1546-208787496316622/AnsiballZ_copy.py'
Feb 16 17:13:03 compute-0 sudo[151529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:04 compute-0 python3.9[151531]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261983.1054401-1546-208787496316622/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:04 compute-0 sudo[151529]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:04 compute-0 sudo[151681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdxwoostqsligoqgnidvtbemehctlggh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261984.167347-1546-227633273410515/AnsiballZ_stat.py'
Feb 16 17:13:04 compute-0 sudo[151681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:04 compute-0 python3.9[151683]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:04 compute-0 sudo[151681]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:04 compute-0 sudo[151804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzsskcwyzupvbvvwqntxlbwhzyishgao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261984.167347-1546-227633273410515/AnsiballZ_copy.py'
Feb 16 17:13:04 compute-0 sudo[151804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:05 compute-0 python3.9[151806]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261984.167347-1546-227633273410515/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:05 compute-0 sudo[151804]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:05 compute-0 sudo[151956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvldifaxyxbxyqireyzwfibujbocrrkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261985.1995516-1546-70545827942354/AnsiballZ_stat.py'
Feb 16 17:13:05 compute-0 sudo[151956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:05 compute-0 python3.9[151958]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:05 compute-0 sudo[151956]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:05 compute-0 sudo[152079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqhyxtbergqlnzubiawldvthzvarjqdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261985.1995516-1546-70545827942354/AnsiballZ_copy.py'
Feb 16 17:13:05 compute-0 sudo[152079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:06 compute-0 python3.9[152081]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261985.1995516-1546-70545827942354/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:06 compute-0 sudo[152079]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:06 compute-0 sudo[152231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqwyhccedndieljigovzpozquahmnltt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261986.2813165-1546-264063546756188/AnsiballZ_stat.py'
Feb 16 17:13:06 compute-0 sudo[152231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:06 compute-0 python3.9[152233]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:06 compute-0 sudo[152231]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:07 compute-0 sudo[152354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yalzfgzfbmpomsnqluajvipuenooaoqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261986.2813165-1546-264063546756188/AnsiballZ_copy.py'
Feb 16 17:13:07 compute-0 sudo[152354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:07 compute-0 python3.9[152356]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261986.2813165-1546-264063546756188/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:07 compute-0 sudo[152354]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:07 compute-0 sudo[152506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrfrinaisiuzrhwustralsiilfcafhgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261987.5236297-1546-278352110495580/AnsiballZ_stat.py'
Feb 16 17:13:07 compute-0 sudo[152506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:07 compute-0 python3.9[152508]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:07 compute-0 sudo[152506]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:08 compute-0 sudo[152629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilpktvpvfsgvvaqupsdudaaoyabgprpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261987.5236297-1546-278352110495580/AnsiballZ_copy.py'
Feb 16 17:13:08 compute-0 sudo[152629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:08 compute-0 python3.9[152631]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261987.5236297-1546-278352110495580/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:08 compute-0 sudo[152629]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:08 compute-0 sudo[152781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boekhjdptvmmrfimtohgmwsdueiaqaqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261988.6859045-1546-200280219957785/AnsiballZ_stat.py'
Feb 16 17:13:08 compute-0 sudo[152781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:09 compute-0 python3.9[152783]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:09 compute-0 sudo[152781]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:09 compute-0 sudo[152904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etahejzbihhqdptzcmlskiobizglivzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261988.6859045-1546-200280219957785/AnsiballZ_copy.py'
Feb 16 17:13:09 compute-0 sudo[152904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:09 compute-0 python3.9[152906]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771261988.6859045-1546-200280219957785/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:09 compute-0 sudo[152904]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:10 compute-0 podman[153030]: 2026-02-16 17:13:10.080238607 +0000 UTC m=+0.046657818 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 17:13:10 compute-0 python3.9[153073]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:13:10 compute-0 sudo[153228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pksypxfbezhoczvuqhgxalbfnyhjpaey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261990.4906414-1958-270278448841858/AnsiballZ_seboolean.py'
Feb 16 17:13:10 compute-0 sudo[153228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:11 compute-0 python3.9[153230]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 16 17:13:12 compute-0 sudo[153228]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:12 compute-0 sudo[153384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ordggpyrrajdvqzgauhjftodqshtznxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261992.2175667-1974-107255052082690/AnsiballZ_copy.py'
Feb 16 17:13:12 compute-0 dbus-broker-launch[807]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 16 17:13:12 compute-0 sudo[153384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:12 compute-0 python3.9[153386]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:12 compute-0 sudo[153384]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:13 compute-0 sudo[153536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmcbfmiahtsajoywaovliaajwildapng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261993.0772645-1974-195862797993210/AnsiballZ_copy.py'
Feb 16 17:13:13 compute-0 sudo[153536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:13 compute-0 python3.9[153538]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:13 compute-0 sudo[153536]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:13 compute-0 sudo[153688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqufrthlxeifmrvreyekthtfxwwiuvsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261993.7112992-1974-121570630986830/AnsiballZ_copy.py'
Feb 16 17:13:13 compute-0 sudo[153688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:14 compute-0 python3.9[153690]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:14 compute-0 sudo[153688]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:14 compute-0 sudo[153840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhlfqzibxkkkfjfuefhkugkeofqzimog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261994.3439267-1974-59318681752307/AnsiballZ_copy.py'
Feb 16 17:13:14 compute-0 sudo[153840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:14 compute-0 python3.9[153842]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:14 compute-0 sudo[153840]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:15 compute-0 sudo[153992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdctvvqcecqnhwhxkoyhotwavodfexxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261994.9004502-1974-204517726484954/AnsiballZ_copy.py'
Feb 16 17:13:15 compute-0 sudo[153992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:15 compute-0 python3.9[153994]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:15 compute-0 sudo[153992]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:15 compute-0 sudo[154144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snjvndxximsbnjskfqtewsidryghtswc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261995.5600555-2046-172826132170215/AnsiballZ_copy.py'
Feb 16 17:13:15 compute-0 sudo[154144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:15 compute-0 python3.9[154146]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:16 compute-0 sudo[154144]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:16 compute-0 sudo[154296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjzemujvnaudqwhrryvwygepnmrrqfqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261996.1804624-2046-253169050309165/AnsiballZ_copy.py'
Feb 16 17:13:16 compute-0 sudo[154296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:16 compute-0 python3.9[154298]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:16 compute-0 sudo[154296]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:17 compute-0 sudo[154459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llwozdnmfixgkgxvrgrlgovmyuqefrow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261996.7500467-2046-113619185151928/AnsiballZ_copy.py'
Feb 16 17:13:17 compute-0 sudo[154459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:17 compute-0 podman[154422]: 2026-02-16 17:13:17.059222091 +0000 UTC m=+0.093023180 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:13:17 compute-0 python3.9[154467]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:17 compute-0 sudo[154459]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:17 compute-0 sudo[154626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hojejgissnjqrvgjybjklkgiqufxghth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261997.355131-2046-118239865951322/AnsiballZ_copy.py'
Feb 16 17:13:17 compute-0 sudo[154626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:17 compute-0 python3.9[154628]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:17 compute-0 sudo[154626]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:18 compute-0 sudo[154778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzvqchomwdezbkkkifwhjnqxnxtmuqep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261997.92558-2046-78651840357805/AnsiballZ_copy.py'
Feb 16 17:13:18 compute-0 sudo[154778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:18 compute-0 python3.9[154780]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:18 compute-0 sudo[154778]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:18 compute-0 sudo[154930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrmuhvjkzjtasqjvwupxcliqejdvwtde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261998.5670068-2118-58795031235791/AnsiballZ_systemd.py'
Feb 16 17:13:18 compute-0 sudo[154930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:19 compute-0 python3.9[154932]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:13:19 compute-0 systemd[1]: Reloading.
Feb 16 17:13:19 compute-0 systemd-rc-local-generator[154960]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:13:19 compute-0 systemd-sysv-generator[154966]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:13:19 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Feb 16 17:13:19 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Feb 16 17:13:19 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Feb 16 17:13:19 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 16 17:13:19 compute-0 systemd[1]: Starting libvirt logging daemon...
Feb 16 17:13:19 compute-0 systemd[1]: Started libvirt logging daemon.
Feb 16 17:13:19 compute-0 sudo[154930]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:19 compute-0 sudo[155130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xowmegszydhpderqlozxckykxymsupvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771261999.64081-2118-197543841073417/AnsiballZ_systemd.py'
Feb 16 17:13:19 compute-0 sudo[155130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:20 compute-0 python3.9[155132]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:13:20 compute-0 systemd[1]: Reloading.
Feb 16 17:13:20 compute-0 systemd-rc-local-generator[155158]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:13:20 compute-0 systemd-sysv-generator[155163]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:13:20 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Feb 16 17:13:20 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 16 17:13:20 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 16 17:13:20 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 16 17:13:20 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 16 17:13:20 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 16 17:13:20 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 16 17:13:20 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 16 17:13:20 compute-0 sudo[155130]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:21 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 16 17:13:21 compute-0 sudo[155353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zavyaxwmnpaorludkjmxvskqirpmrrin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262000.9488938-2118-4665516772375/AnsiballZ_systemd.py'
Feb 16 17:13:21 compute-0 sudo[155353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:21 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 16 17:13:21 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 16 17:13:21 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 16 17:13:21 compute-0 python3.9[155355]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:13:21 compute-0 systemd[1]: Reloading.
Feb 16 17:13:21 compute-0 systemd-rc-local-generator[155384]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:13:21 compute-0 systemd-sysv-generator[155389]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:13:21 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 16 17:13:21 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 16 17:13:21 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 16 17:13:21 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 16 17:13:21 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 17:13:21 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 17:13:21 compute-0 sudo[155353]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:22 compute-0 sudo[155580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbziagdtnoklltqwfivwtosaggfdlter ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262002.013979-2118-187171776321058/AnsiballZ_systemd.py'
Feb 16 17:13:22 compute-0 sudo[155580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:22 compute-0 setroubleshoot[155279]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a6173483-6d05-43cc-ba27-4b9f4c943301
Feb 16 17:13:22 compute-0 setroubleshoot[155279]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Feb 16 17:13:22 compute-0 rsyslogd[1020]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 17:13:22 compute-0 python3.9[155582]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:13:22 compute-0 systemd[1]: Reloading.
Feb 16 17:13:22 compute-0 systemd-rc-local-generator[155607]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:13:22 compute-0 systemd-sysv-generator[155612]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:13:22 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Feb 16 17:13:22 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Feb 16 17:13:22 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 16 17:13:22 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 16 17:13:22 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 16 17:13:22 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 16 17:13:22 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 16 17:13:22 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 16 17:13:22 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 16 17:13:22 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 16 17:13:22 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 16 17:13:22 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 16 17:13:22 compute-0 sudo[155580]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:23 compute-0 sudo[155804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwaolaeigrgoxqamrjwzcbtthusnkhnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262003.0477548-2118-194234083524022/AnsiballZ_systemd.py'
Feb 16 17:13:23 compute-0 sudo[155804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:23 compute-0 python3.9[155806]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:13:23 compute-0 systemd[1]: Reloading.
Feb 16 17:13:23 compute-0 systemd-rc-local-generator[155829]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:13:23 compute-0 systemd-sysv-generator[155837]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:13:23 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Feb 16 17:13:23 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Feb 16 17:13:23 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Feb 16 17:13:23 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 16 17:13:23 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 16 17:13:23 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 16 17:13:23 compute-0 systemd[1]: Starting libvirt secret daemon...
Feb 16 17:13:23 compute-0 systemd[1]: Started libvirt secret daemon.
Feb 16 17:13:23 compute-0 sudo[155804]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:24 compute-0 sudo[156022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pljmksawaixsazgwmpasiohqfeacpnsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262004.1697307-2192-268994610386063/AnsiballZ_file.py'
Feb 16 17:13:24 compute-0 sudo[156022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:24 compute-0 python3.9[156024]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:24 compute-0 sudo[156022]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:25 compute-0 sudo[156174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neygqldqxvfexlwkwlugomotiqawmjfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262004.7790644-2208-79456371736600/AnsiballZ_find.py'
Feb 16 17:13:25 compute-0 sudo[156174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:25 compute-0 python3.9[156176]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 17:13:25 compute-0 sudo[156174]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:26 compute-0 sudo[156326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpeshylhlrahjxaywnqidajmnaawdlra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262005.8131661-2236-209235625213321/AnsiballZ_stat.py'
Feb 16 17:13:26 compute-0 sudo[156326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:26 compute-0 python3.9[156328]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:26 compute-0 sudo[156326]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:26 compute-0 sudo[156449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brkkrlhupiifvewtiumffxapuyibdkkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262005.8131661-2236-209235625213321/AnsiballZ_copy.py'
Feb 16 17:13:26 compute-0 sudo[156449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:26 compute-0 python3.9[156451]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262005.8131661-2236-209235625213321/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:26 compute-0 sudo[156449]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:27 compute-0 sudo[156601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhaxlyvvggxxiarvxwqjsmxkuaeattjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262007.108253-2268-235729107168950/AnsiballZ_file.py'
Feb 16 17:13:27 compute-0 sudo[156601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:27 compute-0 python3.9[156603]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:27 compute-0 sudo[156601]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:28 compute-0 sudo[156753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtuxqsqhcwxslusjnfliidtfxxswaxcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262007.710147-2284-173605799562707/AnsiballZ_stat.py'
Feb 16 17:13:28 compute-0 sudo[156753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:28 compute-0 python3.9[156755]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:28 compute-0 sudo[156753]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:28 compute-0 sudo[156831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubahlybckaobmvmoqxvomvfsyjtvgues ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262007.710147-2284-173605799562707/AnsiballZ_file.py'
Feb 16 17:13:28 compute-0 sudo[156831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:28 compute-0 python3.9[156833]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:28 compute-0 sudo[156831]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:28 compute-0 sudo[156983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggxfvqiyuyjofpqjemqgvltcllqwnetn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262008.759488-2308-199859067547650/AnsiballZ_stat.py'
Feb 16 17:13:29 compute-0 sudo[156983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:29 compute-0 python3.9[156985]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:29 compute-0 sudo[156983]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:29 compute-0 sudo[157061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwfdocgllftioadrrpfbgrjuklgsrszs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262008.759488-2308-199859067547650/AnsiballZ_file.py'
Feb 16 17:13:29 compute-0 sudo[157061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:29 compute-0 python3.9[157063]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.qwhmk2_c recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:29 compute-0 sudo[157061]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:30 compute-0 sudo[157213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aftoaxaprrlisvbtwzgpcarbnmeqxpjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262009.8224661-2332-63809624718656/AnsiballZ_stat.py'
Feb 16 17:13:30 compute-0 sudo[157213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:30 compute-0 python3.9[157215]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:30 compute-0 sudo[157213]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:30 compute-0 sudo[157291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxyztiwrfhqhoapexjrlywmyxbqvamio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262009.8224661-2332-63809624718656/AnsiballZ_file.py'
Feb 16 17:13:30 compute-0 sudo[157291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:30 compute-0 python3.9[157293]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:30 compute-0 sudo[157291]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:31 compute-0 sudo[157443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmjswkqtttocwhdiussqvgmriqaobijw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262010.9640021-2358-115417235087941/AnsiballZ_command.py'
Feb 16 17:13:31 compute-0 sudo[157443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:31 compute-0 python3.9[157445]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:13:31 compute-0 sudo[157443]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:32 compute-0 sudo[157596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohxdbahjdvaogdkixhdciyofdyoiolmg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771262011.6259775-2374-70538187803561/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 17:13:32 compute-0 sudo[157596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:32 compute-0 python3[157598]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 17:13:32 compute-0 sudo[157596]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:32 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 16 17:13:32 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 16 17:13:32 compute-0 sudo[157748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhgbueoypepphnujbyfvvmqwpxukjuty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262012.4381642-2390-50372943573509/AnsiballZ_stat.py'
Feb 16 17:13:32 compute-0 sudo[157748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:32 compute-0 python3.9[157750]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:32 compute-0 sudo[157748]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:33 compute-0 sudo[157826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpdtmpmgarkdqlkljjxaitaftjefqnvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262012.4381642-2390-50372943573509/AnsiballZ_file.py'
Feb 16 17:13:33 compute-0 sudo[157826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:33 compute-0 python3.9[157828]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:33 compute-0 sudo[157826]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:33 compute-0 sudo[157978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewlrezhozkqvufqljzbpumvldlumiwog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262013.6316926-2414-263392083944997/AnsiballZ_stat.py'
Feb 16 17:13:33 compute-0 sudo[157978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:34 compute-0 python3.9[157980]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:34 compute-0 sudo[157978]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:34 compute-0 sudo[158103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipqyjsxbkljqthxnycigacfzwckvfwwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262013.6316926-2414-263392083944997/AnsiballZ_copy.py'
Feb 16 17:13:34 compute-0 sudo[158103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:34 compute-0 python3.9[158105]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771262013.6316926-2414-263392083944997/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:34 compute-0 sudo[158103]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:35 compute-0 sudo[158255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owrsynxkalipbsistjvfpziymsrulcbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262014.826303-2444-83791757752722/AnsiballZ_stat.py'
Feb 16 17:13:35 compute-0 sudo[158255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:35 compute-0 python3.9[158257]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:35 compute-0 sudo[158255]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:35 compute-0 sudo[158333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqqkpyhcvoxrciofwiuiqavuyshsyggz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262014.826303-2444-83791757752722/AnsiballZ_file.py'
Feb 16 17:13:35 compute-0 sudo[158333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:35 compute-0 python3.9[158335]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:35 compute-0 sudo[158333]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:36 compute-0 sudo[158485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeakofmcjpsmfrrvmeizkqpknjnwedtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262015.8611798-2468-57943323988754/AnsiballZ_stat.py'
Feb 16 17:13:36 compute-0 sudo[158485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:36 compute-0 python3.9[158487]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:36 compute-0 sudo[158485]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:36 compute-0 sudo[158563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izhqgucyjcwseqjszrbqiuhnhmsgghab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262015.8611798-2468-57943323988754/AnsiballZ_file.py'
Feb 16 17:13:36 compute-0 sudo[158563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:36 compute-0 python3.9[158565]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:36 compute-0 sudo[158563]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:37 compute-0 sudo[158715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nugvogqxyppwcczgxewithskiqlcqznp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262016.9760609-2492-157634606317478/AnsiballZ_stat.py'
Feb 16 17:13:37 compute-0 sudo[158715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:37 compute-0 python3.9[158717]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:37 compute-0 sudo[158715]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:38 compute-0 sudo[158840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gamfdpktekvzvedilfwqhktfouwlksmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262016.9760609-2492-157634606317478/AnsiballZ_copy.py'
Feb 16 17:13:38 compute-0 sudo[158840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:13:38.137 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:13:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:13:38.139 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:13:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:13:38.139 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:13:38 compute-0 python3.9[158842]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771262016.9760609-2492-157634606317478/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:38 compute-0 sudo[158840]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:38 compute-0 sudo[158992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtpjymdiyzoydjdexwhpztewreoqgllq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262018.3890464-2522-92487650345050/AnsiballZ_file.py'
Feb 16 17:13:38 compute-0 sudo[158992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:38 compute-0 python3.9[158994]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:38 compute-0 sudo[158992]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:39 compute-0 sudo[159144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pahiyzipajbjidxmakeoxuxzqksyimjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262018.9871845-2538-102368676456886/AnsiballZ_command.py'
Feb 16 17:13:39 compute-0 sudo[159144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:39 compute-0 python3.9[159146]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:13:39 compute-0 sudo[159144]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:40 compute-0 sudo[159299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lslabpmmzqvqfsshyicsuliswjcacjtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262019.6053505-2554-219074378593242/AnsiballZ_blockinfile.py'
Feb 16 17:13:40 compute-0 sudo[159299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:40 compute-0 python3.9[159301]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:40 compute-0 sudo[159299]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:40 compute-0 sudo[159461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neshhceguhotqvjxbqyfudccaojxursd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262020.4993575-2572-125566363899294/AnsiballZ_command.py'
Feb 16 17:13:40 compute-0 sudo[159461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:40 compute-0 podman[159425]: 2026-02-16 17:13:40.768773996 +0000 UTC m=+0.050286007 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Feb 16 17:13:40 compute-0 python3.9[159464]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:13:40 compute-0 sudo[159461]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:41 compute-0 sudo[159623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msfsvrswaaoadoyyqxvneewcfpiajrkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262021.1090853-2588-2968439394905/AnsiballZ_stat.py'
Feb 16 17:13:41 compute-0 sudo[159623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:41 compute-0 python3.9[159625]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:13:41 compute-0 sudo[159623]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:42 compute-0 sudo[159777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tagspjqsefdagonkhyigpxstsnvthvmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262021.8304484-2604-49204720532065/AnsiballZ_command.py'
Feb 16 17:13:42 compute-0 sudo[159777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:42 compute-0 python3.9[159779]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:13:42 compute-0 sudo[159777]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:42 compute-0 sudo[159933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsluzvxwsdfxqykzxtoxnjunpatowncs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262022.4667263-2620-247007613268208/AnsiballZ_file.py'
Feb 16 17:13:42 compute-0 sudo[159933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:42 compute-0 python3.9[159935]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:42 compute-0 sudo[159933]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:43 compute-0 sudo[160085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpqrfwcbfezubyzuisuqudkezcwfhzlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262023.1241753-2636-257365528407294/AnsiballZ_stat.py'
Feb 16 17:13:43 compute-0 sudo[160085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:43 compute-0 python3.9[160087]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:43 compute-0 sudo[160085]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:43 compute-0 sudo[160208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgctgezguzdzznsaqcriifasjsyhekbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262023.1241753-2636-257365528407294/AnsiballZ_copy.py'
Feb 16 17:13:43 compute-0 sudo[160208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:44 compute-0 python3.9[160210]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262023.1241753-2636-257365528407294/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:44 compute-0 sudo[160208]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:44 compute-0 sudo[160360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjbayicnnbazhtzjlhqrrmbfhocaorby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262024.3884676-2666-131104014226279/AnsiballZ_stat.py'
Feb 16 17:13:44 compute-0 sudo[160360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:44 compute-0 python3.9[160362]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:44 compute-0 sudo[160360]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:45 compute-0 sudo[160483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psrqpxmcjcnckehdhliajjnusughdysl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262024.3884676-2666-131104014226279/AnsiballZ_copy.py'
Feb 16 17:13:45 compute-0 sudo[160483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:45 compute-0 python3.9[160485]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262024.3884676-2666-131104014226279/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:45 compute-0 sudo[160483]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:45 compute-0 sudo[160635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brbnhclbqqlgrdivkqmmcfexjzfkmlek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262025.6696024-2696-145961098080894/AnsiballZ_stat.py'
Feb 16 17:13:45 compute-0 sudo[160635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:46 compute-0 python3.9[160637]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:13:46 compute-0 sudo[160635]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:46 compute-0 sudo[160758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saytjccednzmbirgdpkjtuujisqrtxwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262025.6696024-2696-145961098080894/AnsiballZ_copy.py'
Feb 16 17:13:46 compute-0 sudo[160758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:46 compute-0 python3.9[160760]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262025.6696024-2696-145961098080894/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:13:46 compute-0 sudo[160758]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:47 compute-0 sudo[160910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfvcdpfjtacsxnwvqlarafirqpaltwpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262026.8019328-2726-154178386487403/AnsiballZ_systemd.py'
Feb 16 17:13:47 compute-0 sudo[160910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:47 compute-0 podman[160912]: 2026-02-16 17:13:47.257276486 +0000 UTC m=+0.130415491 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller)
Feb 16 17:13:47 compute-0 python3.9[160913]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:13:47 compute-0 systemd[1]: Reloading.
Feb 16 17:13:47 compute-0 systemd-rc-local-generator[160966]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:13:47 compute-0 systemd-sysv-generator[160969]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:13:47 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Feb 16 17:13:47 compute-0 sudo[160910]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:48 compute-0 sudo[161135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iszkuywvnlherjiidmgalhihtltzkuzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262027.8895633-2742-187663252105593/AnsiballZ_systemd.py'
Feb 16 17:13:48 compute-0 sudo[161135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:13:48 compute-0 python3.9[161137]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 16 17:13:48 compute-0 systemd[1]: Reloading.
Feb 16 17:13:48 compute-0 systemd-rc-local-generator[161165]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:13:48 compute-0 systemd-sysv-generator[161168]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:13:48 compute-0 systemd[1]: Reloading.
Feb 16 17:13:48 compute-0 systemd-sysv-generator[161209]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:13:48 compute-0 systemd-rc-local-generator[161206]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:13:49 compute-0 sudo[161135]: pam_unix(sudo:session): session closed for user root
Feb 16 17:13:49 compute-0 sshd-session[106309]: Connection closed by 192.168.122.30 port 53704
Feb 16 17:13:49 compute-0 sshd-session[106306]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:13:49 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Feb 16 17:13:49 compute-0 systemd[1]: session-23.scope: Consumed 3min 8.339s CPU time.
Feb 16 17:13:49 compute-0 systemd-logind[821]: Session 23 logged out. Waiting for processes to exit.
Feb 16 17:13:49 compute-0 systemd-logind[821]: Removed session 23.
Feb 16 17:13:54 compute-0 sshd-session[161248]: Accepted publickey for zuul from 192.168.122.30 port 45152 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:13:54 compute-0 systemd-logind[821]: New session 24 of user zuul.
Feb 16 17:13:54 compute-0 systemd[1]: Started Session 24 of User zuul.
Feb 16 17:13:54 compute-0 sshd-session[161248]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:13:55 compute-0 python3.9[161401]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:13:56 compute-0 python3.9[161555]: ansible-ansible.builtin.service_facts Invoked
Feb 16 17:13:56 compute-0 network[161572]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 17:13:56 compute-0 network[161573]: 'network-scripts' will be removed from distribution in near future.
Feb 16 17:13:56 compute-0 network[161574]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 17:14:01 compute-0 anacron[39749]: Job `cron.daily' started
Feb 16 17:14:01 compute-0 anacron[39749]: Job `cron.daily' terminated
Feb 16 17:14:02 compute-0 sudo[161846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liytqnprowohqfrusnlsdchgrvjvkyyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262041.9260685-73-233384985003459/AnsiballZ_setup.py'
Feb 16 17:14:02 compute-0 sudo[161846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:02 compute-0 python3.9[161848]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 16 17:14:02 compute-0 sudo[161846]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:03 compute-0 sudo[161930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmdtmdistcavsoutwxapvrhmivjqfwrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262041.9260685-73-233384985003459/AnsiballZ_dnf.py'
Feb 16 17:14:03 compute-0 sudo[161930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:03 compute-0 python3.9[161932]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:14:08 compute-0 sudo[161930]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:09 compute-0 sudo[162083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-binkbglysckndgjqmtaaoyjefppesxwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262048.920917-97-198388044761068/AnsiballZ_stat.py'
Feb 16 17:14:09 compute-0 sudo[162083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:09 compute-0 python3.9[162085]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:14:09 compute-0 sudo[162083]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:10 compute-0 sudo[162235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-litjbwylsyqvoepjtevrkwniegjqmacl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262049.9869926-117-18419177632646/AnsiballZ_command.py'
Feb 16 17:14:10 compute-0 sudo[162235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:10 compute-0 python3.9[162237]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:14:10 compute-0 sudo[162235]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:11 compute-0 podman[162322]: 2026-02-16 17:14:11.131473731 +0000 UTC m=+0.102638509 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 17:14:11 compute-0 sudo[162407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmbtzaphrajhpvrkisjxevnxligqlkce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262050.9283226-137-161603977310961/AnsiballZ_stat.py'
Feb 16 17:14:11 compute-0 sudo[162407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:11 compute-0 python3.9[162409]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:14:11 compute-0 sudo[162407]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:12 compute-0 sudo[162559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlioloyofforrntafvejhyaslmtnqrpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262051.7310076-153-274382825637215/AnsiballZ_command.py'
Feb 16 17:14:12 compute-0 sudo[162559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:12 compute-0 python3.9[162561]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:14:12 compute-0 sudo[162559]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:12 compute-0 sudo[162712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvfbpeomozlgmzlmxnkkvqlarndniosv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262052.4116132-169-232564643716590/AnsiballZ_stat.py'
Feb 16 17:14:12 compute-0 sudo[162712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:12 compute-0 python3.9[162714]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:14:12 compute-0 sudo[162712]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:13 compute-0 sudo[162835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aweyrkpcuwvsforqiknousorfxprthjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262052.4116132-169-232564643716590/AnsiballZ_copy.py'
Feb 16 17:14:13 compute-0 sudo[162835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:13 compute-0 python3.9[162837]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262052.4116132-169-232564643716590/.source.iscsi _original_basename=._oz5_l67 follow=False checksum=95d9e6cd54522cbfc30a88b4aac4970eb627e7d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:13 compute-0 sudo[162835]: pam_unix(sudo:session): session closed for user root
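The two tasks above generate a fresh initiator IQN with /usr/sbin/iscsi-iname and install it as /etc/iscsi/initiatorname.iscsi (mode 0644). A minimal Python sketch of the same flow, assuming the stock iqn.1994-05.com.redhat prefix and a random hex suffix (iscsi-iname's exact suffix format is an assumption here, not taken from the log):

```python
import secrets

def make_initiator_name(prefix: str = "iqn.1994-05.com.redhat") -> str:
    # Shape of the IQN that /usr/sbin/iscsi-iname emits; the random
    # hex suffix is an assumption for illustration.
    return f"{prefix}:{secrets.token_hex(6)}"

def initiatorname_file(iqn: str) -> str:
    # Single-line content of /etc/iscsi/initiatorname.iscsi.
    return f"InitiatorName={iqn}\n"

content = initiatorname_file(make_initiator_name())
```

In the play itself the IQN comes from the iscsi-iname command task and is written out by the copy task; this sketch only mirrors that shape.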
Feb 16 17:14:14 compute-0 sudo[162987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyuhlyfxasdotbymyiewdflsocjrhjqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262053.6203518-199-101447278332988/AnsiballZ_file.py'
Feb 16 17:14:14 compute-0 sudo[162987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:14 compute-0 python3.9[162989]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:14 compute-0 sudo[162987]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:14 compute-0 sudo[163139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xetrlmdhxttjperxgrkvzzsegsrbxowz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262054.4917173-215-123794750405456/AnsiballZ_lineinfile.py'
Feb 16 17:14:14 compute-0 sudo[163139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:15 compute-0 python3.9[163141]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:15 compute-0 sudo[163139]: pam_unix(sudo:session): session closed for user root
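The lineinfile invocation above enforces node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 in /etc/iscsi/iscsid.conf, replacing an existing chap_algs line or inserting one after the commented default. A rough Python sketch of that replace-or-insert behavior (not Ansible's actual implementation):

```python
import re

CHAP_LINE = "node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5"

def set_chap_algs(text: str, line: str = CHAP_LINE) -> str:
    lines = text.splitlines()
    # regexp=^node.session.auth.chap_algs -> replace an existing line in place
    for i, ln in enumerate(lines):
        if re.match(r"^node\.session\.auth\.chap_algs", ln):
            lines[i] = line
            return "\n".join(lines) + "\n"
    # insertafter=^#node.session.auth.chap.algs -> insert below the anchor
    for i, ln in enumerate(lines):
        if re.match(r"^#node\.session\.auth\.chap\.algs", ln):
            lines.insert(i + 1, line)
            return "\n".join(lines) + "\n"
    lines.append(line)  # no anchor either: fall back to EOF, as lineinfile does
    return "\n".join(lines) + "\n"
```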
Feb 16 17:14:15 compute-0 sudo[163291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flgttjltwjazqesrpsyqzvakxtxwtmsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262055.3250198-233-105297638233406/AnsiballZ_systemd_service.py'
Feb 16 17:14:15 compute-0 sudo[163291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:16 compute-0 python3.9[163293]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:14:17 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 16 17:14:17 compute-0 sudo[163291]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:17 compute-0 sudo[163461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypjacjncgahgzojqzfxufhcdkciaxkta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262057.415911-249-94344276358935/AnsiballZ_systemd_service.py'
Feb 16 17:14:17 compute-0 sudo[163461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:17 compute-0 podman[163421]: 2026-02-16 17:14:17.744069992 +0000 UTC m=+0.087366144 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)

Feb 16 17:14:17 compute-0 python3.9[163467]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:14:18 compute-0 systemd[1]: Reloading.
Feb 16 17:14:18 compute-0 systemd-sysv-generator[163504]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:14:18 compute-0 systemd-rc-local-generator[163498]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:14:18 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 16 17:14:18 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 16 17:14:18 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Feb 16 17:14:18 compute-0 systemd[1]: Started Open-iSCSI.
Feb 16 17:14:18 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 16 17:14:18 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 16 17:14:18 compute-0 sudo[163461]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:19 compute-0 python3.9[163682]: ansible-ansible.builtin.service_facts Invoked
Feb 16 17:14:19 compute-0 network[163699]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 17:14:19 compute-0 network[163700]: 'network-scripts' will be removed from distribution in near future.
Feb 16 17:14:19 compute-0 network[163701]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 17:14:24 compute-0 sudo[163971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhothssonjkjtsubpimjchcudkdaxvti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262064.327271-295-140990185529787/AnsiballZ_dnf.py'
Feb 16 17:14:24 compute-0 sudo[163971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:24 compute-0 python3.9[163973]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:14:33 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 17:14:33 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 17:14:33 compute-0 systemd[1]: Reloading.
Feb 16 17:14:33 compute-0 systemd-rc-local-generator[164068]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:14:33 compute-0 systemd-sysv-generator[164071]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:14:33 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 17:14:33 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 17:14:33 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 17:14:33 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 17:14:33 compute-0 systemd[1]: run-r0486a90ccfb9448d8bf1ed09f29fe579.service: Deactivated successfully.
Feb 16 17:14:33 compute-0 systemd[1]: run-r47f1f128373841ca9d4bdd51f8ab3339.service: Deactivated successfully.
Feb 16 17:14:34 compute-0 sudo[163971]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:34 compute-0 sudo[164348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxacuqrcgtajzntlcketlwfrypdcifjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262074.4278953-313-247529485592364/AnsiballZ_file.py'
Feb 16 17:14:34 compute-0 sudo[164348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:34 compute-0 python3.9[164350]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 16 17:14:34 compute-0 sudo[164348]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:35 compute-0 sudo[164500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-horiahecgpltjzbqsxprwbmcbxiksesd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262075.1673822-329-266744290583149/AnsiballZ_modprobe.py'
Feb 16 17:14:35 compute-0 sudo[164500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:35 compute-0 python3.9[164502]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 16 17:14:35 compute-0 sudo[164500]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:36 compute-0 sudo[164656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdzlehagrpgmqncsnqwkzhgsrkothqvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262075.8971312-345-166882137430844/AnsiballZ_stat.py'
Feb 16 17:14:36 compute-0 sudo[164656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:36 compute-0 python3.9[164658]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:14:36 compute-0 sudo[164656]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:36 compute-0 sudo[164779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyjvefopeuifidcctxmrnoihmropusmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262075.8971312-345-166882137430844/AnsiballZ_copy.py'
Feb 16 17:14:36 compute-0 sudo[164779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:36 compute-0 python3.9[164781]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262075.8971312-345-166882137430844/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:36 compute-0 sudo[164779]: pam_unix(sudo:session): session closed for user root
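The stat/copy pair above installs /etc/modules-load.d/dm-multipath.conf from the module-load.conf.j2 template so systemd-modules-load loads the module at boot; the lineinfile task that follows adds the same name to /etc/modules. Judging by the module name passed around it, the rendered file presumably contains just:

```
dm-multipath
```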
Feb 16 17:14:37 compute-0 sudo[164931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhzxpliigmmltpxrjgrbsxnhzcikapwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262077.05901-377-226541155186927/AnsiballZ_lineinfile.py'
Feb 16 17:14:37 compute-0 sudo[164931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:37 compute-0 python3.9[164933]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:37 compute-0 sudo[164931]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:14:38.139 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:14:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:14:38.141 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:14:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:14:38.141 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:14:38 compute-0 sudo[165083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rppcfgqsxeymqpjhruvimjxmfjumztqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262077.633275-393-136702048772759/AnsiballZ_systemd.py'
Feb 16 17:14:38 compute-0 sudo[165083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:38 compute-0 python3.9[165085]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:14:38 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 16 17:14:38 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 16 17:14:38 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 16 17:14:38 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 16 17:14:38 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 16 17:14:38 compute-0 sudo[165083]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:39 compute-0 sudo[165239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-squcsqqzjzomclayyebmpelifonuxxup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262078.7782035-409-276422228687555/AnsiballZ_command.py'
Feb 16 17:14:39 compute-0 sudo[165239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:39 compute-0 python3.9[165241]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:14:39 compute-0 sudo[165239]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:39 compute-0 sudo[165392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbnqmpvirftvbpuvyhpqhoqzwvgxlmhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262079.4892285-429-127247665536479/AnsiballZ_stat.py'
Feb 16 17:14:39 compute-0 sudo[165392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:39 compute-0 python3.9[165394]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:14:39 compute-0 sudo[165392]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:40 compute-0 sudo[165544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmsxhkfwphdcnfyomcqelzwinxyjqfno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262080.1937876-447-200607982883359/AnsiballZ_stat.py'
Feb 16 17:14:40 compute-0 sudo[165544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:40 compute-0 python3.9[165546]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:14:40 compute-0 sudo[165544]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:40 compute-0 sudo[165667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bafnvnkxkwoglitowidphijkysybeeck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262080.1937876-447-200607982883359/AnsiballZ_copy.py'
Feb 16 17:14:40 compute-0 sudo[165667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:41 compute-0 python3.9[165669]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262080.1937876-447-200607982883359/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:41 compute-0 sudo[165667]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:41 compute-0 sudo[165831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahyszpzeapkaufxmpvzexjdoedxqhkvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262081.2494392-477-124503691058514/AnsiballZ_command.py'
Feb 16 17:14:41 compute-0 sudo[165831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:41 compute-0 podman[165793]: 2026-02-16 17:14:41.533596541 +0000 UTC m=+0.072019047 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 16 17:14:41 compute-0 python3.9[165840]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:14:41 compute-0 sudo[165831]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:42 compute-0 sudo[165991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgzqcvjsucyemzclyutpovpmepmcsldn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262081.8861997-493-261603676653390/AnsiballZ_lineinfile.py'
Feb 16 17:14:42 compute-0 sudo[165991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:42 compute-0 python3.9[165993]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:42 compute-0 sudo[165991]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:42 compute-0 sudo[166143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hivlzatsjrgdtiiqbiovaknrngrqwcbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262082.45923-509-21844265855759/AnsiballZ_replace.py'
Feb 16 17:14:42 compute-0 sudo[166143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:42 compute-0 python3.9[166145]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:43 compute-0 sudo[166143]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:43 compute-0 sudo[166295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxwrckcwvkexofxawqldhoqpqfhthkdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262083.1969306-525-149227507025797/AnsiballZ_replace.py'
Feb 16 17:14:43 compute-0 sudo[166295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:43 compute-0 python3.9[166297]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:43 compute-0 sudo[166295]: pam_unix(sudo:session): session closed for user root
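The grep, lineinfile, and two replace invocations above together ensure /etc/multipath.conf has a blacklist { } section and drop any catch-all devnode ".*" rule from it. A hedged Python sketch of the combined effect (the play's conditional wiring between tasks is simplified here):

```python
import re

def ensure_empty_blacklist(conf: str) -> str:
    if not re.search(r"^blacklist\s*\{", conf, re.M):
        # lineinfile: add a 'blacklist {' line at EOF;
        # replace ^(blacklist {) -> \1\n} closes the new section
        conf = conf.rstrip("\n") + "\nblacklist {\n"
        conf = re.sub(r"^(blacklist \{)", r"\1\n}", conf, flags=re.M)
    # replace: strip a catch-all 'devnode ".*"' directly inside the section
    conf = re.sub(r'^blacklist\s*\{\n\s+devnode "\.\*"', "blacklist {",
                  conf, flags=re.M)
    return conf
```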
Feb 16 17:14:44 compute-0 sudo[166447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tetiidpdvhpdslxpoehclwydeaugqujq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262083.7941012-543-12972617018105/AnsiballZ_lineinfile.py'
Feb 16 17:14:44 compute-0 sudo[166447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:44 compute-0 python3.9[166449]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:44 compute-0 sudo[166447]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:44 compute-0 sudo[166599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgdvvvhcgcfgiaoquqqkgwvvfcfmxrec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262084.3863716-543-107378079598416/AnsiballZ_lineinfile.py'
Feb 16 17:14:44 compute-0 sudo[166599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:44 compute-0 python3.9[166601]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:44 compute-0 sudo[166599]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:45 compute-0 sudo[166751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euepcqxtbeqxkbjrdniulcgaqlmlralq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262084.945294-543-238470520782923/AnsiballZ_lineinfile.py'
Feb 16 17:14:45 compute-0 sudo[166751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:45 compute-0 python3.9[166753]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:45 compute-0 sudo[166751]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:45 compute-0 sudo[166903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebckgjowjilznsftbhlbcxpilfnjdufa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262085.4946861-543-221110890867869/AnsiballZ_lineinfile.py'
Feb 16 17:14:45 compute-0 sudo[166903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:45 compute-0 python3.9[166905]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:45 compute-0 sudo[166903]: pam_unix(sudo:session): session closed for user root
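The four lineinfile tasks above pin find_multipaths, recheck_wwid, skip_kpartx, and user_friendly_names in the defaults section of /etc/multipath.conf. Each uses insertafter=^defaults with firstmatch=True, so on a file where none of the options exist yet the section ends up roughly as follows (line order can vary with the starting file, since each insert lands directly under the defaults line):

```
defaults {
        user_friendly_names no
        skip_kpartx yes
        recheck_wwid yes
        find_multipaths yes
}
```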
Feb 16 17:14:46 compute-0 sudo[167055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zepipwhyjhodvhadslaxmwbbddlffjis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262086.0813825-601-90458307553450/AnsiballZ_stat.py'
Feb 16 17:14:46 compute-0 sudo[167055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:46 compute-0 python3.9[167057]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:14:46 compute-0 sudo[167055]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:47 compute-0 sudo[167209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tyxjwlcxktnovgajyaxkaexqhpdouric ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262086.8028765-617-191805334940932/AnsiballZ_command.py'
Feb 16 17:14:47 compute-0 sudo[167209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:47 compute-0 python3.9[167211]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:14:47 compute-0 sudo[167209]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:47 compute-0 sudo[167362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-susanxclqcyfyvkyfmsjlnmwdydyjchr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262087.5310638-635-29002409062621/AnsiballZ_systemd_service.py'
Feb 16 17:14:47 compute-0 sudo[167362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:47 compute-0 podman[167364]: 2026-02-16 17:14:47.893817001 +0000 UTC m=+0.084366511 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Feb 16 17:14:48 compute-0 python3.9[167365]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:14:48 compute-0 systemd[1]: Listening on multipathd control socket.
Feb 16 17:14:48 compute-0 sudo[167362]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:48 compute-0 sudo[167545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsbarvnqmztmfszeyomrbadbddoytkkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262088.3536136-651-94343875189483/AnsiballZ_systemd_service.py'
Feb 16 17:14:48 compute-0 sudo[167545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:48 compute-0 python3.9[167547]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:14:48 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 16 17:14:48 compute-0 udevadm[167552]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 16 17:14:49 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 16 17:14:49 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 16 17:14:49 compute-0 multipathd[167555]: --------start up--------
Feb 16 17:14:49 compute-0 multipathd[167555]: read /etc/multipath.conf
Feb 16 17:14:49 compute-0 multipathd[167555]: path checkers start up
Feb 16 17:14:49 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 16 17:14:49 compute-0 sudo[167545]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:49 compute-0 sudo[167713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlmmudsvclmxicnxnbijrflqyjtfedmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262089.511128-675-6020729883971/AnsiballZ_file.py'
Feb 16 17:14:49 compute-0 sudo[167713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:49 compute-0 python3.9[167715]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 16 17:14:49 compute-0 sudo[167713]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:50 compute-0 sudo[167865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biqgtynufdimiydfpqntqkcbsdfosxlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262090.2189512-691-166804189232870/AnsiballZ_modprobe.py'
Feb 16 17:14:50 compute-0 sudo[167865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:50 compute-0 python3.9[167867]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 16 17:14:50 compute-0 kernel: Key type psk registered
Feb 16 17:14:50 compute-0 sudo[167865]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:51 compute-0 sudo[168029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oytuamulhsagdbntxlvizikikqozvkmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262090.8777976-707-63113329342634/AnsiballZ_stat.py'
Feb 16 17:14:51 compute-0 sudo[168029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:51 compute-0 python3.9[168031]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:14:51 compute-0 sudo[168029]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:51 compute-0 sudo[168152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gukuprglktplmhbkwzyqhuwffcwlpptl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262090.8777976-707-63113329342634/AnsiballZ_copy.py'
Feb 16 17:14:51 compute-0 sudo[168152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:51 compute-0 python3.9[168154]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262090.8777976-707-63113329342634/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:51 compute-0 sudo[168152]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:52 compute-0 sudo[168304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wytrinscrdyfplmoqazssiasltlpcmnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262092.0552547-739-84925115674974/AnsiballZ_lineinfile.py'
Feb 16 17:14:52 compute-0 sudo[168304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:52 compute-0 python3.9[168306]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:14:52 compute-0 sudo[168304]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:52 compute-0 sudo[168456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akyowltdlrggkondcknnxoldbvykupng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262092.6795506-755-5751513916542/AnsiballZ_systemd.py'
Feb 16 17:14:52 compute-0 sudo[168456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:53 compute-0 python3.9[168458]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:14:53 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 16 17:14:53 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 16 17:14:53 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 16 17:14:53 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 16 17:14:53 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 16 17:14:53 compute-0 sudo[168456]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:54 compute-0 sudo[168612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-douorxccfprxydngthmqvvubxhevtbkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262093.7149167-771-104252684528750/AnsiballZ_dnf.py'
Feb 16 17:14:54 compute-0 sudo[168612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:54 compute-0 python3.9[168614]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 16 17:14:56 compute-0 systemd[1]: Reloading.
Feb 16 17:14:56 compute-0 systemd-sysv-generator[168650]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:14:56 compute-0 systemd-rc-local-generator[168647]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:14:56 compute-0 systemd[1]: Reloading.
Feb 16 17:14:56 compute-0 systemd-sysv-generator[168691]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:14:56 compute-0 systemd-rc-local-generator[168687]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:14:56 compute-0 virtnodedevd[155180]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 16 17:14:56 compute-0 virtnodedevd[155180]: hostname: compute-0
Feb 16 17:14:56 compute-0 virtnodedevd[155180]: nl_recv returned with error: No buffer space available
Feb 16 17:14:56 compute-0 systemd-logind[821]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 16 17:14:56 compute-0 systemd-logind[821]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 16 17:14:57 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 16 17:14:57 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 16 17:14:57 compute-0 systemd[1]: Reloading.
Feb 16 17:14:57 compute-0 systemd-sysv-generator[168795]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:14:57 compute-0 systemd-rc-local-generator[168792]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:14:57 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 16 17:14:57 compute-0 sudo[168612]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:58 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 16 17:14:58 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 16 17:14:58 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.044s CPU time.
Feb 16 17:14:58 compute-0 systemd[1]: run-r3f64353012174d138d304dc536ad2b2e.service: Deactivated successfully.
Feb 16 17:14:58 compute-0 sudo[170102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oounfwbuokgnzjxzwalcpotviddqdfdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262098.3426366-787-144985837870678/AnsiballZ_systemd_service.py'
Feb 16 17:14:58 compute-0 sudo[170102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:58 compute-0 python3.9[170104]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:14:58 compute-0 systemd[1]: Stopping Open-iSCSI...
Feb 16 17:14:58 compute-0 iscsid[163524]: iscsid shutting down.
Feb 16 17:14:58 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Feb 16 17:14:58 compute-0 systemd[1]: Stopped Open-iSCSI.
Feb 16 17:14:58 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 16 17:14:58 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 16 17:14:58 compute-0 systemd[1]: Started Open-iSCSI.
Feb 16 17:14:59 compute-0 sudo[170102]: pam_unix(sudo:session): session closed for user root
Feb 16 17:14:59 compute-0 sudo[170258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsckjrmkxdifuxlxdgqrjrabmmxzgvxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262099.1718004-803-196938409634893/AnsiballZ_systemd_service.py'
Feb 16 17:14:59 compute-0 sudo[170258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:14:59 compute-0 python3.9[170260]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:14:59 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 16 17:14:59 compute-0 multipathd[167555]: exit (signal)
Feb 16 17:14:59 compute-0 multipathd[167555]: --------shut down-------
Feb 16 17:14:59 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Feb 16 17:14:59 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 16 17:14:59 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 16 17:14:59 compute-0 multipathd[170266]: --------start up--------
Feb 16 17:14:59 compute-0 multipathd[170266]: read /etc/multipath.conf
Feb 16 17:14:59 compute-0 multipathd[170266]: path checkers start up
Feb 16 17:14:59 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 16 17:14:59 compute-0 sudo[170258]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:00 compute-0 python3.9[170424]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:15:01 compute-0 sudo[170578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cibmctcnthwmlmkzqvhlwrbzgmnmhfif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262101.171358-838-69559671938489/AnsiballZ_file.py'
Feb 16 17:15:01 compute-0 sudo[170578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:01 compute-0 python3.9[170580]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:01 compute-0 sudo[170578]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:02 compute-0 sudo[170730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iogxiyttwzciqfjyrzyzcrfqtydbrtez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262102.1144485-860-192709741157061/AnsiballZ_systemd_service.py'
Feb 16 17:15:02 compute-0 sudo[170730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:02 compute-0 python3.9[170732]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 17:15:02 compute-0 systemd[1]: Reloading.
Feb 16 17:15:02 compute-0 systemd-sysv-generator[170769]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:15:02 compute-0 systemd-rc-local-generator[170764]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:15:03 compute-0 sudo[170730]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:03 compute-0 python3.9[170925]: ansible-ansible.builtin.service_facts Invoked
Feb 16 17:15:03 compute-0 network[170942]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 17:15:03 compute-0 network[170943]: 'network-scripts' will be removed from distribution in near future.
Feb 16 17:15:03 compute-0 network[170944]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 17:15:06 compute-0 sudo[171215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udbxvamhcgrzsgifafjsmpnskqbirvlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262106.508723-898-118210500233011/AnsiballZ_systemd_service.py'
Feb 16 17:15:06 compute-0 sudo[171215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:07 compute-0 python3.9[171217]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:15:07 compute-0 sudo[171215]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:08 compute-0 sudo[171368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnnuoqozwtnyyedoycgdwwmkrwkkksoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262107.7451708-898-100507010743845/AnsiballZ_systemd_service.py'
Feb 16 17:15:08 compute-0 sudo[171368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:08 compute-0 python3.9[171370]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:15:08 compute-0 sudo[171368]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:08 compute-0 sudo[171521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngdmyobspwlpkiumgjeahzpqkrrawjdz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262108.4705055-898-8675504922074/AnsiballZ_systemd_service.py'
Feb 16 17:15:08 compute-0 sudo[171521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:08 compute-0 python3.9[171523]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:15:09 compute-0 sudo[171521]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:09 compute-0 sudo[171674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnsgzgmctvjzgewdcqkmfgeatkngawas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262109.1642346-898-46539878418574/AnsiballZ_systemd_service.py'
Feb 16 17:15:09 compute-0 sudo[171674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:09 compute-0 python3.9[171676]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:15:09 compute-0 sudo[171674]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:10 compute-0 sudo[171827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcaygludkohvcwcmgyvqalpupkedgaxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262109.868408-898-100966435633293/AnsiballZ_systemd_service.py'
Feb 16 17:15:10 compute-0 sudo[171827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:10 compute-0 python3.9[171829]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:15:10 compute-0 sudo[171827]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:10 compute-0 sudo[171980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upxevedotbziifxktnwdjzhaemhkbkxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262110.643504-898-43600609072533/AnsiballZ_systemd_service.py'
Feb 16 17:15:10 compute-0 sudo[171980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:11 compute-0 python3.9[171982]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:15:11 compute-0 sudo[171980]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:11 compute-0 sudo[172143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcdwolvzitbcfjlfgyilyuxzhuwtjjpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262111.3393993-898-77316209634137/AnsiballZ_systemd_service.py'
Feb 16 17:15:11 compute-0 sudo[172143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:11 compute-0 podman[172107]: 2026-02-16 17:15:11.674225259 +0000 UTC m=+0.077694447 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 16 17:15:11 compute-0 python3.9[172147]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:15:11 compute-0 sudo[172143]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:12 compute-0 sudo[172304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjcrkynobfwulydrfiesehewfgqicxbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262112.0609539-898-194191143270569/AnsiballZ_systemd_service.py'
Feb 16 17:15:12 compute-0 sudo[172304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:12 compute-0 python3.9[172306]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:15:12 compute-0 sudo[172304]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:13 compute-0 sudo[172457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyuhriodcpkojtnkholkjdqnqgkmboci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262113.0216289-1016-214916881800371/AnsiballZ_file.py'
Feb 16 17:15:13 compute-0 sudo[172457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:13 compute-0 python3.9[172459]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:13 compute-0 sudo[172457]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:13 compute-0 sudo[172610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvsryusjsnzecoyhcvwcffucrhlhppwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262113.6425407-1016-111679275228931/AnsiballZ_file.py'
Feb 16 17:15:13 compute-0 sudo[172610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:14 compute-0 python3.9[172612]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:14 compute-0 sudo[172610]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:14 compute-0 sudo[172762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grsiwoeufjfbpqsgvcogdfaakleggyyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262114.2362204-1016-218090304918967/AnsiballZ_file.py'
Feb 16 17:15:14 compute-0 sudo[172762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:14 compute-0 python3.9[172764]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:14 compute-0 sudo[172762]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:15 compute-0 sudo[172914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guzcfyqakduvsddkgyyflyhnxjwrmlsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262114.7971966-1016-11827934135671/AnsiballZ_file.py'
Feb 16 17:15:15 compute-0 sudo[172914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:15 compute-0 python3.9[172916]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:15 compute-0 sudo[172914]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:15 compute-0 sudo[173066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tykxbkhihulhpbfaxyvxllxgfgledizz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262115.4133565-1016-138834472351970/AnsiballZ_file.py'
Feb 16 17:15:15 compute-0 sudo[173066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:15 compute-0 python3.9[173068]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:15 compute-0 sudo[173066]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:16 compute-0 sudo[173218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvpmjchcovwnqlnrdqnxqjxrxpaewsrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262116.0324721-1016-94915725216018/AnsiballZ_file.py'
Feb 16 17:15:16 compute-0 sudo[173218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:16 compute-0 python3.9[173220]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:16 compute-0 sudo[173218]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:17 compute-0 sudo[173370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmgwbhskrnwfrfdxajiapdsazopipyte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262116.7577996-1016-186522988516903/AnsiballZ_file.py'
Feb 16 17:15:17 compute-0 sudo[173370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:17 compute-0 python3.9[173372]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:17 compute-0 sudo[173370]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:17 compute-0 sudo[173522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wndostutymmypopjxbqxhobfvybdnfdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262117.3142748-1016-139458985433431/AnsiballZ_file.py'
Feb 16 17:15:17 compute-0 sudo[173522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:17 compute-0 python3.9[173524]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:17 compute-0 sudo[173522]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:18 compute-0 podman[173600]: 2026-02-16 17:15:18.130004392 +0000 UTC m=+0.102644466 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Feb 16 17:15:18 compute-0 sudo[173701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duebrvwzobwovbmlbqghbqhthakdbpfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262117.9519162-1130-160817423015742/AnsiballZ_file.py'
Feb 16 17:15:18 compute-0 sudo[173701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:18 compute-0 python3.9[173703]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:18 compute-0 sudo[173701]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:18 compute-0 sudo[173853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhmixnaupgtofohtnwzhqjefqldgjbfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262118.5440361-1130-15726073444574/AnsiballZ_file.py'
Feb 16 17:15:18 compute-0 sudo[173853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:18 compute-0 python3.9[173855]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:18 compute-0 sudo[173853]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:19 compute-0 sudo[174005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbchkqdpmfxncrojybedjetjzhplokox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262119.115893-1130-90342161547422/AnsiballZ_file.py'
Feb 16 17:15:19 compute-0 sudo[174005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:19 compute-0 python3.9[174007]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:19 compute-0 sudo[174005]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:19 compute-0 sudo[174157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sewblazwtxrezseeaejpdtjzjiozcyww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262119.6786919-1130-163954725303347/AnsiballZ_file.py'
Feb 16 17:15:19 compute-0 sudo[174157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:20 compute-0 python3.9[174159]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:20 compute-0 sudo[174157]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:20 compute-0 sudo[174309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqaivuxksjyidnstuxumcjtcypphsdvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262120.2719188-1130-122959371999245/AnsiballZ_file.py'
Feb 16 17:15:20 compute-0 sudo[174309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:20 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 16 17:15:20 compute-0 python3.9[174311]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:20 compute-0 sudo[174309]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:21 compute-0 sudo[174462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htsgpflhoycrhxzvpyqagnysfzeezteu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262120.8421874-1130-145897556880054/AnsiballZ_file.py'
Feb 16 17:15:21 compute-0 sudo[174462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:21 compute-0 python3.9[174464]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:21 compute-0 sudo[174462]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:21 compute-0 sudo[174614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hefzutmcxpyylcjvkhseeunecvfboszg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262121.4577096-1130-29529759525878/AnsiballZ_file.py'
Feb 16 17:15:21 compute-0 sudo[174614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:21 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 17:15:21 compute-0 python3.9[174616]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:21 compute-0 sudo[174614]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:22 compute-0 sudo[174767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypavubpycrrgghtriziuzkwtfgkgqikp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262122.0259194-1130-260906542565778/AnsiballZ_file.py'
Feb 16 17:15:22 compute-0 sudo[174767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:22 compute-0 python3.9[174769]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:22 compute-0 sudo[174767]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:22 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Feb 16 17:15:22 compute-0 sudo[174920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcqcdmjdayqkyslhdlcjvdupzlgbbicd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262122.7313747-1246-85722302568699/AnsiballZ_command.py'
Feb 16 17:15:22 compute-0 sudo[174920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:23 compute-0 python3.9[174922]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:15:23 compute-0 sudo[174920]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:23 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 16 17:15:23 compute-0 python3.9[175074]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 17:15:24 compute-0 sudo[175225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tejsoeumrncmrcjbjqkegcjcysvngqiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262124.221311-1282-186037654552575/AnsiballZ_systemd_service.py'
Feb 16 17:15:24 compute-0 sudo[175225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:24 compute-0 python3.9[175227]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 17:15:24 compute-0 systemd[1]: Reloading.
Feb 16 17:15:24 compute-0 systemd-rc-local-generator[175251]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:15:24 compute-0 systemd-sysv-generator[175260]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:15:25 compute-0 sudo[175225]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:25 compute-0 sudo[175420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lecctbomkztonhhynohsngwschwrogdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262125.1847217-1298-275331915420622/AnsiballZ_command.py'
Feb 16 17:15:25 compute-0 sudo[175420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:25 compute-0 python3.9[175422]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:15:25 compute-0 sudo[175420]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:26 compute-0 sudo[175573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvdrwgpgpbwtkwpqvvdjuhtsovgnjhwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262125.8191645-1298-132011790181177/AnsiballZ_command.py'
Feb 16 17:15:26 compute-0 sudo[175573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:26 compute-0 python3.9[175575]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:15:26 compute-0 sudo[175573]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:26 compute-0 sudo[175726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qizmzetzqbwyyjcndjkpkvvqiiaarxgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262126.412031-1298-41530063864594/AnsiballZ_command.py'
Feb 16 17:15:26 compute-0 sudo[175726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:26 compute-0 python3.9[175728]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:15:26 compute-0 sudo[175726]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:27 compute-0 sudo[175879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwuafmcyhkhlmrfaccuqwupgdkvpxiby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262126.9312286-1298-267018604010552/AnsiballZ_command.py'
Feb 16 17:15:27 compute-0 sudo[175879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:27 compute-0 python3.9[175881]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:15:27 compute-0 sudo[175879]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:27 compute-0 sudo[176032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvbcvylivgvrsixvqucaltlcccgpjfkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262127.5411763-1298-89666800584579/AnsiballZ_command.py'
Feb 16 17:15:27 compute-0 sudo[176032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:27 compute-0 python3.9[176034]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:15:27 compute-0 sudo[176032]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:28 compute-0 sudo[176185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqjaftshgncukyfpbzzqjcowjqkdpens ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262128.1129832-1298-48193434739422/AnsiballZ_command.py'
Feb 16 17:15:28 compute-0 sudo[176185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:28 compute-0 python3.9[176187]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:15:28 compute-0 sudo[176185]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:28 compute-0 sudo[176338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtbjtkutxbubvsasxgedsjpastioioxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262128.6822095-1298-201325103378364/AnsiballZ_command.py'
Feb 16 17:15:28 compute-0 sudo[176338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:29 compute-0 python3.9[176340]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:15:29 compute-0 sudo[176338]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:29 compute-0 sudo[176491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivywapscsvdlmiurstskbioaarbpwnem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262129.3283963-1298-206994661979015/AnsiballZ_command.py'
Feb 16 17:15:29 compute-0 sudo[176491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:29 compute-0 python3.9[176493]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:15:29 compute-0 sudo[176491]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:31 compute-0 sudo[176644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoxeqtoxcavareouadudaghcfkrfeium ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262130.7881746-1441-158003427435456/AnsiballZ_file.py'
Feb 16 17:15:31 compute-0 sudo[176644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:31 compute-0 python3.9[176646]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:31 compute-0 sudo[176644]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:31 compute-0 sudo[176796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbhpcuotmkulepilrcayiusnqelqjyhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262131.4059455-1441-243787995431740/AnsiballZ_file.py'
Feb 16 17:15:31 compute-0 sudo[176796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:31 compute-0 python3.9[176798]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:31 compute-0 sudo[176796]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:32 compute-0 sudo[176948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duckphhgpnbimxyfzgebtzxqigmhryqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262132.0072637-1471-194570973841738/AnsiballZ_file.py'
Feb 16 17:15:32 compute-0 sudo[176948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:32 compute-0 python3.9[176950]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:32 compute-0 sudo[176948]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:32 compute-0 sudo[177100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wufnvxendjqwzmiqaaggyxexrnwkprid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262132.6255774-1471-36404636096191/AnsiballZ_file.py'
Feb 16 17:15:32 compute-0 sudo[177100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:33 compute-0 python3.9[177102]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:33 compute-0 sudo[177100]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:33 compute-0 sudo[177252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjykizzkcdelzwewozzvdzjlcnaqvtca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262133.2169664-1471-9694114643257/AnsiballZ_file.py'
Feb 16 17:15:33 compute-0 sudo[177252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:33 compute-0 python3.9[177254]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:33 compute-0 sudo[177252]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:34 compute-0 sudo[177404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwmeszsxnmoyalwtupjtuwqnucjxipje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262133.7794387-1471-220781833869200/AnsiballZ_file.py'
Feb 16 17:15:34 compute-0 sudo[177404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:34 compute-0 python3.9[177406]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:34 compute-0 sudo[177404]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:34 compute-0 sudo[177556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruutnpucykfrfjkuhvdifaveuxkladyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262134.3853924-1471-15098142296614/AnsiballZ_file.py'
Feb 16 17:15:34 compute-0 sudo[177556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:34 compute-0 python3.9[177558]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:34 compute-0 sudo[177556]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:35 compute-0 sudo[177708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaazwthjjxyukkusclvldzvnjivjtjgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262134.972133-1471-92638907432308/AnsiballZ_file.py'
Feb 16 17:15:35 compute-0 sudo[177708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:35 compute-0 python3.9[177710]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:35 compute-0 sudo[177708]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:35 compute-0 sudo[177860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prluryngfzjcptajnhdidfwantghhapx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262135.5429845-1471-272256423853197/AnsiballZ_file.py'
Feb 16 17:15:35 compute-0 sudo[177860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:35 compute-0 python3.9[177862]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:36 compute-0 sudo[177860]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:15:38.140 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:15:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:15:38.142 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:15:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:15:38.142 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:15:40 compute-0 sudo[178012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnpuzugeojtqzltitdmijunxylooipbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262140.0760174-1708-94247936618258/AnsiballZ_getent.py'
Feb 16 17:15:40 compute-0 sudo[178012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:40 compute-0 python3.9[178014]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 16 17:15:40 compute-0 sudo[178012]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:41 compute-0 sudo[178165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rybpgyoafenknmemzlfwjlftmcjattxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262140.900675-1724-163010752009711/AnsiballZ_group.py'
Feb 16 17:15:41 compute-0 sudo[178165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:41 compute-0 python3.9[178167]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 17:15:41 compute-0 groupadd[178168]: group added to /etc/group: name=nova, GID=42436
Feb 16 17:15:41 compute-0 groupadd[178168]: group added to /etc/gshadow: name=nova
Feb 16 17:15:41 compute-0 groupadd[178168]: new group: name=nova, GID=42436
Feb 16 17:15:41 compute-0 sudo[178165]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:42 compute-0 podman[178250]: 2026-02-16 17:15:42.133211705 +0000 UTC m=+0.096403902 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 16 17:15:42 compute-0 sudo[178344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtajwsnmcwhvjllkzgjughdlapjallbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262141.8983057-1740-70547823329292/AnsiballZ_user.py'
Feb 16 17:15:42 compute-0 sudo[178344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:42 compute-0 python3.9[178346]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 17:15:42 compute-0 useradd[178348]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Feb 16 17:15:42 compute-0 useradd[178348]: add 'nova' to group 'libvirt'
Feb 16 17:15:42 compute-0 useradd[178348]: add 'nova' to shadow group 'libvirt'
Feb 16 17:15:42 compute-0 sudo[178344]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:43 compute-0 sshd-session[178379]: Accepted publickey for zuul from 192.168.122.30 port 46588 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:15:43 compute-0 systemd-logind[821]: New session 25 of user zuul.
Feb 16 17:15:43 compute-0 systemd[1]: Started Session 25 of User zuul.
Feb 16 17:15:43 compute-0 sshd-session[178379]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:15:43 compute-0 sshd-session[178382]: Received disconnect from 192.168.122.30 port 46588:11: disconnected by user
Feb 16 17:15:43 compute-0 sshd-session[178382]: Disconnected from user zuul 192.168.122.30 port 46588
Feb 16 17:15:43 compute-0 sshd-session[178379]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:15:43 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Feb 16 17:15:43 compute-0 systemd-logind[821]: Session 25 logged out. Waiting for processes to exit.
Feb 16 17:15:43 compute-0 systemd-logind[821]: Removed session 25.
Feb 16 17:15:44 compute-0 python3.9[178532]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:15:44 compute-0 python3.9[178608]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:45 compute-0 python3.9[178758]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:15:45 compute-0 python3.9[178879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771262145.1008267-1790-95083049743582/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:46 compute-0 python3.9[179029]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:15:47 compute-0 python3.9[179150]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771262146.1407678-1790-197896855037991/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:47 compute-0 python3.9[179300]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:15:48 compute-0 python3.9[179421]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771262147.1678782-1790-60537835688545/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:48 compute-0 podman[179545]: 2026-02-16 17:15:48.51229629 +0000 UTC m=+0.079862961 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 17:15:48 compute-0 python3.9[179584]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:15:49 compute-0 python3.9[179718]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771262148.2022963-1898-1978950592408/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:49 compute-0 sudo[179868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvjfvjgvslouvdqpflyafowhkwqagmjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262149.313496-1928-21748464930840/AnsiballZ_file.py'
Feb 16 17:15:49 compute-0 sudo[179868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:49 compute-0 python3.9[179870]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:49 compute-0 sudo[179868]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:50 compute-0 sudo[180020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exyqauqedyviqevmihidtzshflynfyra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262149.9346104-1944-188250335085635/AnsiballZ_copy.py'
Feb 16 17:15:50 compute-0 sudo[180020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:50 compute-0 python3.9[180022]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:50 compute-0 sudo[180020]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:50 compute-0 sudo[180172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgvtgjuxgwrjsaigxplekxjcaozvdnfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262150.5260334-1960-87059965081835/AnsiballZ_stat.py'
Feb 16 17:15:50 compute-0 sudo[180172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:50 compute-0 python3.9[180174]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:15:50 compute-0 sudo[180172]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:51 compute-0 sudo[180324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uejgskqttibyvyqvgqefvlablinzjdxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262151.141991-1976-157764553973696/AnsiballZ_stat.py'
Feb 16 17:15:51 compute-0 sudo[180324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:51 compute-0 python3.9[180326]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:15:51 compute-0 sudo[180324]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:51 compute-0 sudo[180447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aerycbzcthxvipmyyiyetckxjmvbsncx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262151.141991-1976-157764553973696/AnsiballZ_copy.py'
Feb 16 17:15:51 compute-0 sudo[180447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:52 compute-0 python3.9[180449]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1771262151.141991-1976-157764553973696/.source _original_basename=.bdkzgfl9 follow=False checksum=346b11ffebbce17e2660dae7ab6b869cfae6fa8b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Feb 16 17:15:52 compute-0 sudo[180447]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:52 compute-0 python3.9[180601]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:15:53 compute-0 sudo[180753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-numgpckgzvcawxlvpmnytidhllacgxwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262153.2917643-2032-193610131101533/AnsiballZ_file.py'
Feb 16 17:15:53 compute-0 sudo[180753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:53 compute-0 python3.9[180755]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:53 compute-0 sudo[180753]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:54 compute-0 sudo[180905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbttrktyytjqkelddpfsicpyofxzebfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262153.9651308-2048-199128718914702/AnsiballZ_file.py'
Feb 16 17:15:54 compute-0 sudo[180905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:54 compute-0 python3.9[180907]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:15:54 compute-0 sudo[180905]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:55 compute-0 python3.9[181057]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:15:57 compute-0 sudo[181478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzgfnmpteoscvrxsjdqsclnimexkgmhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262156.5876632-2116-211831189056408/AnsiballZ_container_config_data.py'
Feb 16 17:15:57 compute-0 sudo[181478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:57 compute-0 python3.9[181480]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 16 17:15:57 compute-0 sudo[181478]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:57 compute-0 sudo[181630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxuhdjkjfqxcsyptrariffepfpwftuea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262157.5060134-2138-184605531696241/AnsiballZ_container_config_hash.py'
Feb 16 17:15:57 compute-0 sudo[181630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:58 compute-0 python3.9[181632]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 17:15:58 compute-0 sudo[181630]: pam_unix(sudo:session): session closed for user root
Feb 16 17:15:58 compute-0 sudo[181782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yipidcdopdnyvlldmpviqzztksmyzhja ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771262158.4074585-2158-237868309898573/AnsiballZ_edpm_container_manage.py'
Feb 16 17:15:58 compute-0 sudo[181782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:15:59 compute-0 python3[181784]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 17:15:59 compute-0 podman[181820]: 2026-02-16 17:15:59.321329349 +0000 UTC m=+0.050209187 container create 2c036256c52622a206c9db1c170805517a68d3619b6ccf11395393261d9c0fdb (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute_init, container_name=nova_compute_init, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS)
Feb 16 17:15:59 compute-0 podman[181820]: 2026-02-16 17:15:59.295245122 +0000 UTC m=+0.024124940 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 16 17:15:59 compute-0 python3[181784]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409 --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 16 17:15:59 compute-0 sudo[181782]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:00 compute-0 sudo[182006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lujqxmoigvhicipizlckdsmqmkyewufb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262160.471347-2174-71121594435381/AnsiballZ_stat.py'
Feb 16 17:16:00 compute-0 sudo[182006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:00 compute-0 python3.9[182008]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:16:00 compute-0 sudo[182006]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:02 compute-0 python3.9[182160]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 17:16:02 compute-0 sudo[182310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dluxmnnpecocembpfjfpzjmezwjapops ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262162.6962714-2228-50509812433942/AnsiballZ_stat.py'
Feb 16 17:16:02 compute-0 sudo[182310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:03 compute-0 python3.9[182312]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:16:03 compute-0 sudo[182310]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:03 compute-0 sudo[182435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rikqrtijthsqdkpgrahyntuubsfojfbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262162.6962714-2228-50509812433942/AnsiballZ_copy.py'
Feb 16 17:16:03 compute-0 sudo[182435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:03 compute-0 python3.9[182437]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262162.6962714-2228-50509812433942/.source.yaml _original_basename=.7m1xiuig follow=False checksum=9e1dad9238814f619e0baaa0a4637491805ae4ed backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:03 compute-0 sudo[182435]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:04 compute-0 sudo[182587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llcogabrvddwigkxbvcsmdokbynseaza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262164.0903156-2262-11207218194560/AnsiballZ_file.py'
Feb 16 17:16:04 compute-0 sudo[182587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:04 compute-0 python3.9[182589]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:04 compute-0 sudo[182587]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:04 compute-0 sudo[182739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svsbjfxuxjwydttetjtjdbbtypexjbei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262164.7445273-2278-188440208569223/AnsiballZ_file.py'
Feb 16 17:16:04 compute-0 sudo[182739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:05 compute-0 python3.9[182741]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:16:05 compute-0 sudo[182739]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:05 compute-0 sudo[182891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxprsusnnxlpgxspbtpeabjybioskmph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262165.375828-2294-733202123690/AnsiballZ_stat.py'
Feb 16 17:16:05 compute-0 sudo[182891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:05 compute-0 python3.9[182893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:16:05 compute-0 sudo[182891]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:06 compute-0 sudo[183014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpvpeixatclglqhbksmrchlylmzwyksa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262165.375828-2294-733202123690/AnsiballZ_copy.py'
Feb 16 17:16:06 compute-0 sudo[183014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:06 compute-0 python3.9[183016]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262165.375828-2294-733202123690/.source.json _original_basename=.bf9rquwu follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:06 compute-0 sudo[183014]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:06 compute-0 python3.9[183166]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:09 compute-0 sudo[183587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upychicjsrczijfwyidlcnlyoyuhmodp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262168.8669233-2374-122159370488481/AnsiballZ_container_config_data.py'
Feb 16 17:16:09 compute-0 sudo[183587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:09 compute-0 python3.9[183589]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 16 17:16:09 compute-0 sudo[183587]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:10 compute-0 sudo[183739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plzgrefxixocjjdraoqlpqoywqjmgvfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262169.7755203-2396-27996498313096/AnsiballZ_container_config_hash.py'
Feb 16 17:16:10 compute-0 sudo[183739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:10 compute-0 python3.9[183741]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 17:16:10 compute-0 sudo[183739]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:10 compute-0 sudo[183891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlrhigrqadtbjgztegldlqrxzlqbejjj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771262170.57218-2416-184921726543555/AnsiballZ_edpm_container_manage.py'
Feb 16 17:16:10 compute-0 sudo[183891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:11 compute-0 python3[183893]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 17:16:11 compute-0 podman[183931]: 2026-02-16 17:16:11.220206083 +0000 UTC m=+0.051969370 container create 14cb986004037f98982308929308e47e5959fca75770a482149a3f55897e552b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:16:11 compute-0 podman[183931]: 2026-02-16 17:16:11.192607268 +0000 UTC m=+0.024370575 image pull f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 16 17:16:11 compute-0 python3[183893]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409 --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 16 17:16:11 compute-0 sudo[183891]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:11 compute-0 sudo[184119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czxrujpfklsiwotllliciwdpbullaeuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262171.5011575-2432-208617826095241/AnsiballZ_stat.py'
Feb 16 17:16:11 compute-0 sudo[184119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:11 compute-0 python3.9[184121]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:16:11 compute-0 sudo[184119]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:12 compute-0 sudo[184286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtcbtgdbxndptvepeiziqmvyhmvrkqqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262172.180191-2450-14580046884068/AnsiballZ_file.py'
Feb 16 17:16:12 compute-0 sudo[184286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:12 compute-0 podman[184247]: 2026-02-16 17:16:12.431301615 +0000 UTC m=+0.050801820 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 17:16:12 compute-0 python3.9[184294]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:12 compute-0 sudo[184286]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:12 compute-0 sudo[184368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-allcexvlikqpjrykrruvpwpotngtogrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262172.180191-2450-14580046884068/AnsiballZ_stat.py'
Feb 16 17:16:12 compute-0 sudo[184368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:13 compute-0 python3.9[184370]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:16:13 compute-0 sudo[184368]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:13 compute-0 sudo[184519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njurpdenavjninlsmbevtxxdvgkoauih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262173.1689887-2450-157168563558139/AnsiballZ_copy.py'
Feb 16 17:16:13 compute-0 sudo[184519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:13 compute-0 python3.9[184521]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771262173.1689887-2450-157168563558139/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:13 compute-0 sudo[184519]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:13 compute-0 sudo[184595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuolyprftjyuofkwgmjkkaetpzrudmuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262173.1689887-2450-157168563558139/AnsiballZ_systemd.py'
Feb 16 17:16:13 compute-0 sudo[184595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:14 compute-0 python3.9[184597]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 17:16:14 compute-0 systemd[1]: Reloading.
Feb 16 17:16:14 compute-0 systemd-rc-local-generator[184621]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:16:14 compute-0 systemd-sysv-generator[184627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:16:14 compute-0 sudo[184595]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:14 compute-0 sudo[184713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-talmmvyvkguflzfbmakckebxyjvrsmzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262173.1689887-2450-157168563558139/AnsiballZ_systemd.py'
Feb 16 17:16:14 compute-0 sudo[184713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:15 compute-0 python3.9[184715]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:16:15 compute-0 systemd[1]: Reloading.
Feb 16 17:16:15 compute-0 systemd-rc-local-generator[184749]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:16:15 compute-0 systemd-sysv-generator[184753]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:16:15 compute-0 systemd[1]: Starting nova_compute container...
Feb 16 17:16:15 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:16:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b361c4056db2caae3f039b441bf71654a0ba1cbf92f17ba678197a47791d67f/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b361c4056db2caae3f039b441bf71654a0ba1cbf92f17ba678197a47791d67f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b361c4056db2caae3f039b441bf71654a0ba1cbf92f17ba678197a47791d67f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b361c4056db2caae3f039b441bf71654a0ba1cbf92f17ba678197a47791d67f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b361c4056db2caae3f039b441bf71654a0ba1cbf92f17ba678197a47791d67f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:15 compute-0 podman[184763]: 2026-02-16 17:16:15.594494335 +0000 UTC m=+0.113407623 container init 14cb986004037f98982308929308e47e5959fca75770a482149a3f55897e552b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_id=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:16:15 compute-0 podman[184763]: 2026-02-16 17:16:15.600309979 +0000 UTC m=+0.119223287 container start 14cb986004037f98982308929308e47e5959fca75770a482149a3f55897e552b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=nova_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 17:16:15 compute-0 podman[184763]: nova_compute
Feb 16 17:16:15 compute-0 nova_compute[184779]: + sudo -E kolla_set_configs
Feb 16 17:16:15 compute-0 systemd[1]: Started nova_compute container.
Feb 16 17:16:15 compute-0 sudo[184713]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Validating config file
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Copying service configuration files
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Deleting /etc/ceph
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Creating directory /etc/ceph
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /etc/ceph
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Writing out command to execute
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 17:16:15 compute-0 nova_compute[184779]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 17:16:15 compute-0 nova_compute[184779]: ++ cat /run_command
Feb 16 17:16:15 compute-0 nova_compute[184779]: + CMD=nova-compute
Feb 16 17:16:15 compute-0 nova_compute[184779]: + ARGS=
Feb 16 17:16:15 compute-0 nova_compute[184779]: + sudo kolla_copy_cacerts
Feb 16 17:16:15 compute-0 nova_compute[184779]: + [[ ! -n '' ]]
Feb 16 17:16:15 compute-0 nova_compute[184779]: + . kolla_extend_start
Feb 16 17:16:15 compute-0 nova_compute[184779]: Running command: 'nova-compute'
Feb 16 17:16:15 compute-0 nova_compute[184779]: + echo 'Running command: '\''nova-compute'\'''
Feb 16 17:16:15 compute-0 nova_compute[184779]: + umask 0022
Feb 16 17:16:15 compute-0 nova_compute[184779]: + exec nova-compute
Feb 16 17:16:16 compute-0 python3.9[184940]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 17:16:17 compute-0 sudo[185091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plovoixprjzhxwtkdkaawrqbrigtndzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262176.9867544-2540-279187913157649/AnsiballZ_stat.py'
Feb 16 17:16:17 compute-0 sudo[185091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:17 compute-0 python3.9[185093]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:16:17 compute-0 sudo[185091]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:17 compute-0 nova_compute[184779]: 2026-02-16 17:16:17.635 184783 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 17:16:17 compute-0 nova_compute[184779]: 2026-02-16 17:16:17.636 184783 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 17:16:17 compute-0 nova_compute[184779]: 2026-02-16 17:16:17.636 184783 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 17:16:17 compute-0 nova_compute[184779]: 2026-02-16 17:16:17.636 184783 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 16 17:16:17 compute-0 sudo[185218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unwzdexjemjqmxlgrggggbcqkgsjazss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262176.9867544-2540-279187913157649/AnsiballZ_copy.py'
Feb 16 17:16:17 compute-0 sudo[185218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:17 compute-0 nova_compute[184779]: 2026-02-16 17:16:17.825 184783 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:16:17 compute-0 nova_compute[184779]: 2026-02-16 17:16:17.839 184783 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:16:17 compute-0 nova_compute[184779]: 2026-02-16 17:16:17.839 184783 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 16 17:16:17 compute-0 python3.9[185220]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262176.9867544-2540-279187913157649/.source.yaml _original_basename=.wnzxbot1 follow=False checksum=bd57a4acea1b7fc06842b2dc70f7deb851812c3d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:17 compute-0 sudo[185218]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.364 184783 INFO nova.virt.driver [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.492 184783 INFO nova.compute.provider_config [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.505 184783 DEBUG oslo_concurrency.lockutils [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.506 184783 DEBUG oslo_concurrency.lockutils [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.506 184783 DEBUG oslo_concurrency.lockutils [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.506 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.507 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.507 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.507 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.507 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.507 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.507 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.507 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.507 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.508 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.508 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.508 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.508 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.508 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.508 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.508 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.509 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.509 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.509 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.509 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.509 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.509 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.509 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.509 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.510 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.510 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.510 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.510 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.510 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.510 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.510 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.511 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.511 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.511 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.511 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.511 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.511 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.511 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.512 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.512 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.512 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.512 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.512 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.512 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.513 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.513 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.513 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.513 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.513 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.513 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.513 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.513 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.514 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.514 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.514 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.514 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.514 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.514 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.514 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.515 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.515 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.515 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.515 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.515 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.515 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.515 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.515 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.516 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.516 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.516 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.516 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.516 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.516 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.516 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.516 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.517 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.517 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.517 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.517 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.517 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.517 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.517 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.518 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.518 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.518 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.518 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.518 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.518 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.518 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.519 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.519 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.519 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.519 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.519 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.519 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.519 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.519 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.520 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.520 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.520 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.520 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.520 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.520 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.520 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.520 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.521 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.521 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.521 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.521 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.521 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.521 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.521 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.522 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.522 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.522 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.522 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.522 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.522 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.522 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.522 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.523 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.523 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.523 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.523 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.523 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.523 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.523 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.523 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.524 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.524 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.524 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.524 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.524 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.524 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.524 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.525 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.525 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.525 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.525 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.525 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.525 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.525 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.525 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.526 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.526 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.526 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.526 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.526 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.526 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.526 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.527 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.527 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.527 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.527 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.527 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.527 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.527 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.528 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.528 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.528 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.528 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.528 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.528 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.528 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.528 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.529 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.529 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.529 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.529 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.529 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.529 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.529 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.530 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.530 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.530 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.530 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.530 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.530 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.530 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.531 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.531 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.531 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.531 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.531 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.531 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.531 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.531 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.532 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.532 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.532 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.532 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.532 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.532 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.532 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.533 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.533 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.533 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.533 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.533 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.533 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.533 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.533 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.534 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.534 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.534 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.534 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.534 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.534 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.534 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.535 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.535 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.535 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.535 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.535 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.535 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.535 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.536 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.536 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.536 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.536 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.536 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.536 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.536 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.536 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.537 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.537 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.537 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.537 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.537 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.537 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.537 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.537 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.538 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.538 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.538 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.538 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.538 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.538 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.538 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.539 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.539 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.539 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.539 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.539 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.539 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.539 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.540 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.540 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.540 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.540 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.540 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.540 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.540 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.540 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.541 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.541 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.541 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.541 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.541 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.541 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.541 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.542 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.542 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.542 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.542 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.542 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.542 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.542 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.543 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.543 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.543 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.543 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.543 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.543 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.543 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.543 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.544 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.544 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.544 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.544 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.544 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.544 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.544 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.545 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.545 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.545 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.545 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.545 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.545 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.545 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.545 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.546 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.546 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.546 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.546 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.546 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.546 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.546 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.547 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.547 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.547 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.547 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.547 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.547 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.547 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.548 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.548 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.548 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.548 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.548 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.548 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.548 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.548 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.549 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.549 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.549 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.549 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.549 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.549 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.549 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.549 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.550 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.550 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.550 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.550 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.550 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.550 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.550 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.551 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.551 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.551 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.551 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.551 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.551 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.551 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.552 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.552 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.552 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.552 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.552 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.552 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.552 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.552 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.553 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.553 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.553 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.553 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.553 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.553 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.553 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.554 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.554 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.554 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.554 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.554 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.554 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.555 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.555 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.555 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.555 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.555 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.555 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.555 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.556 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.556 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.556 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.556 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.556 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.556 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.556 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.556 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.557 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.557 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.557 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.557 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.557 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.557 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.557 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.557 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.558 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.558 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.558 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.558 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.558 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.558 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.558 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.559 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.559 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.559 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.559 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.559 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.559 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.559 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.560 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.560 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.560 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.560 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.560 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.560 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.560 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.560 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.561 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.561 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.561 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.561 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.561 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.561 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.561 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.562 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.562 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.562 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.562 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.562 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.562 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.562 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.562 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.563 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.563 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.563 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.563 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.563 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.563 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.563 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.564 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.564 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.564 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.564 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.564 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.564 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.564 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.564 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.565 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.565 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.565 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.565 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.565 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.565 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.565 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.565 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.566 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.566 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.566 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.566 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.566 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.566 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.566 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.567 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.567 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.567 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.567 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.567 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.567 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.567 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.568 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.568 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.568 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.568 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.568 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.568 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.568 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.568 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.569 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.569 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.569 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.569 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.569 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.569 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.569 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.570 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.570 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.570 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.570 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.570 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.570 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.570 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.570 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.571 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.571 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.571 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.571 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.571 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.571 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.571 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.572 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.572 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.572 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.572 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.572 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.572 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.572 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.572 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.573 184783 WARNING oslo_config.cfg [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 16 17:16:18 compute-0 nova_compute[184779]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 16 17:16:18 compute-0 nova_compute[184779]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 16 17:16:18 compute-0 nova_compute[184779]: and ``live_migration_inbound_addr`` respectively.
Feb 16 17:16:18 compute-0 nova_compute[184779]: ).  Its value may be silently ignored in the future.
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.573 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.573 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.573 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.573 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.573 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.574 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.574 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.574 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.574 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.574 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.574 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.574 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.575 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.575 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.575 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.575 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.575 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.575 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.575 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.576 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.576 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.576 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.576 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.576 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.576 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.576 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.577 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.577 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.577 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.577 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.577 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.577 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.577 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.578 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.578 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.578 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.578 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.578 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.578 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.578 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.579 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.579 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.579 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.579 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.579 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.579 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.579 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.580 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.580 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.580 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.580 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.580 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.580 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.580 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.580 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.581 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.581 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.581 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.581 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.581 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.581 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.581 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.582 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.582 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.582 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.582 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.582 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.582 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.582 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.582 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.583 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.583 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.583 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.583 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.583 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.583 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.584 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.584 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.584 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.584 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.584 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.584 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.584 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.585 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.585 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.585 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.585 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.585 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.585 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.585 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.586 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.586 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.586 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.586 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.586 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.586 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.586 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.587 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.587 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.587 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.587 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.587 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.587 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.587 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.587 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.588 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.588 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.588 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.588 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.588 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.588 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.588 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.588 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.589 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.589 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.589 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.589 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.589 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.589 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.589 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.590 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.590 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.590 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.590 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.590 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.590 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.590 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.590 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.591 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.591 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.591 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.591 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.591 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.591 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.592 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.592 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.592 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.592 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.592 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.592 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.593 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.593 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.593 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.593 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.593 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.593 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.593 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.594 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.594 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.594 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.594 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.594 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.594 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.594 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.595 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.595 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.595 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.595 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.595 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.595 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.595 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.595 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.596 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.596 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.596 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.596 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.596 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.596 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.596 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.597 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.597 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.597 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.597 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.597 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.597 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.597 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.597 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.598 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.598 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.598 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.598 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.598 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.598 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.599 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.599 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.599 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.599 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.599 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.599 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.599 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.600 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.600 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.600 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.600 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.600 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.600 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.600 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.601 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.601 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.601 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.601 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.601 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.601 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.601 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.602 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.602 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.602 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.602 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.602 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.602 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.602 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.603 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.603 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.603 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.603 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.603 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.603 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.603 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.604 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.604 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.604 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.604 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.604 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.604 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.604 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.605 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.605 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.605 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.605 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.605 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.605 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.605 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.605 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.606 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.606 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.606 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.606 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.606 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.606 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.606 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.607 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.607 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.607 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.607 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.607 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.607 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.607 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.608 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.608 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.608 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.608 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.608 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.609 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.609 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.609 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.609 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.609 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.609 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.609 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.609 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.610 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.610 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.610 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.610 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.610 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.610 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.611 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.611 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.611 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.611 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.611 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.611 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.611 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.612 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.612 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.612 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.612 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.612 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.612 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.612 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.613 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.613 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.613 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.613 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.613 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.613 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.613 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.614 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.614 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.614 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.614 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.614 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.614 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.614 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.615 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.615 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.615 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.615 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.615 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.615 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.615 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.616 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.616 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.616 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.616 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.616 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.616 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.616 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.617 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.617 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.617 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.617 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.617 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.617 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.617 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.618 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.618 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.618 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.618 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.618 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.618 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.618 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.619 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.619 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.619 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.619 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.619 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.619 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.619 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.620 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.620 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.620 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.620 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.620 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.620 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.620 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.621 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.621 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.621 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.621 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.621 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.621 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.622 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.622 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.622 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.622 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.622 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.622 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.622 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.623 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.623 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.623 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.623 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.623 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.623 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.623 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.623 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.624 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.624 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.624 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.624 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.624 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.624 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.624 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.625 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.625 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.625 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.625 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.625 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.625 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.625 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.626 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.626 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.626 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.626 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.626 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.626 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.626 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.627 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.627 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.627 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.627 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.627 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.627 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.627 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.628 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.628 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.628 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.628 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.628 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.628 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.628 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.628 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.629 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.629 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.629 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.629 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.629 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.629 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.629 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.630 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.630 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.630 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.630 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.630 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.630 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.630 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.631 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.631 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.631 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.631 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.631 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.631 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.631 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.632 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.632 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.632 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.632 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.632 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.632 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.632 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.633 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.633 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.633 184783 DEBUG oslo_service.service [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.634 184783 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.652 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.654 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.654 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.654 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 16 17:16:18 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 16 17:16:18 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.717 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fe15e6a4370> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.720 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fe15e6a4370> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.721 184783 INFO nova.virt.libvirt.driver [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Connection event '1' reason 'None'
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.733 184783 WARNING nova.virt.libvirt.driver [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 16 17:16:18 compute-0 nova_compute[184779]: 2026-02-16 17:16:18.734 184783 DEBUG nova.virt.libvirt.volume.mount [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 16 17:16:18 compute-0 podman[185346]: 2026-02-16 17:16:18.753039723 +0000 UTC m=+0.090810879 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:16:18 compute-0 python3.9[185400]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:16:19 compute-0 python3.9[185610]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.554 184783 INFO nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Libvirt host capabilities <capabilities>
Feb 16 17:16:19 compute-0 nova_compute[184779]: 
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <host>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <uuid>a72ae0da-02c0-4729-9eb8-f910b339152d</uuid>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <cpu>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <arch>x86_64</arch>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model>EPYC-Rome-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <vendor>AMD</vendor>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <microcode version='16777317'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <signature family='23' model='49' stepping='0'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='x2apic'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='tsc-deadline'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='osxsave'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='hypervisor'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='tsc_adjust'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='spec-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='stibp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='arch-capabilities'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='cmp_legacy'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='topoext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='virt-ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='lbrv'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='tsc-scale'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='vmcb-clean'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='pause-filter'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='pfthreshold'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='svme-addr-chk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='rdctl-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='skip-l1dfl-vmentry'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='mds-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature name='pschange-mc-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <pages unit='KiB' size='4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <pages unit='KiB' size='2048'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <pages unit='KiB' size='1048576'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </cpu>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <power_management>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <suspend_mem/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <suspend_disk/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <suspend_hybrid/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </power_management>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <iommu support='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <migration_features>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <live/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <uri_transports>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <uri_transport>tcp</uri_transport>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <uri_transport>rdma</uri_transport>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </uri_transports>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </migration_features>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <topology>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <cells num='1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <cell id='0'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:           <memory unit='KiB'>7864292</memory>
Feb 16 17:16:19 compute-0 nova_compute[184779]:           <pages unit='KiB' size='4'>1966073</pages>
Feb 16 17:16:19 compute-0 nova_compute[184779]:           <pages unit='KiB' size='2048'>0</pages>
Feb 16 17:16:19 compute-0 nova_compute[184779]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 16 17:16:19 compute-0 nova_compute[184779]:           <distances>
Feb 16 17:16:19 compute-0 nova_compute[184779]:             <sibling id='0' value='10'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:           </distances>
Feb 16 17:16:19 compute-0 nova_compute[184779]:           <cpus num='8'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:           </cpus>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         </cell>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </cells>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </topology>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <cache>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </cache>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <secmodel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model>selinux</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <doi>0</doi>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </secmodel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <secmodel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model>dac</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <doi>0</doi>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </secmodel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </host>
Feb 16 17:16:19 compute-0 nova_compute[184779]: 
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <guest>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <os_type>hvm</os_type>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <arch name='i686'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <wordsize>32</wordsize>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <domain type='qemu'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <domain type='kvm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </arch>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <features>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <pae/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <nonpae/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <acpi default='on' toggle='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <apic default='on' toggle='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <cpuselection/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <deviceboot/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <disksnapshot default='on' toggle='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <externalSnapshot/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </features>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </guest>
Feb 16 17:16:19 compute-0 nova_compute[184779]: 
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <guest>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <os_type>hvm</os_type>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <arch name='x86_64'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <wordsize>64</wordsize>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <domain type='qemu'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <domain type='kvm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </arch>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <features>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <acpi default='on' toggle='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <apic default='on' toggle='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <cpuselection/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <deviceboot/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <disksnapshot default='on' toggle='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <externalSnapshot/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </features>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </guest>
Feb 16 17:16:19 compute-0 nova_compute[184779]: 
Feb 16 17:16:19 compute-0 nova_compute[184779]: </capabilities>
Feb 16 17:16:19 compute-0 nova_compute[184779]: 
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.563 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.587 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 16 17:16:19 compute-0 nova_compute[184779]: <domainCapabilities>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <domain>kvm</domain>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <arch>i686</arch>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <vcpu max='4096'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <iothreads supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <os supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <enum name='firmware'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <loader supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>rom</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pflash</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='readonly'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>yes</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>no</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='secure'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>no</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </loader>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </os>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <cpu>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='host-passthrough' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='hostPassthroughMigratable'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>on</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>off</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='maximum' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='maximumMigratable'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>on</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>off</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='host-model' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <vendor>AMD</vendor>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='x2apic'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='hypervisor'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='stibp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='overflow-recov'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='succor'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='lbrv'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc-scale'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='flushbyasid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='pause-filter'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='pfthreshold'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='disable' name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='custom' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='ClearwaterForest'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ddpd-u'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sha512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm3'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='ClearwaterForest-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ddpd-u'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sha512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm3'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Dhyana-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Turin'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbpb'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Turin-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbpb'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-128'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-256'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-128'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-256'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v6'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v7'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='KnightsMill'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512er'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512pf'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='KnightsMill-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512er'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512pf'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G4-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tbm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G5-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tbm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='athlon'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='athlon-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='core2duo'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='core2duo-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='coreduo'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='coreduo-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='n270'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='n270-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='phenom'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='phenom-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </cpu>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <memoryBacking supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <enum name='sourceType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>file</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>anonymous</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>memfd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </memoryBacking>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <devices>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <disk supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='diskDevice'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>disk</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>cdrom</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>floppy</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>lun</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='bus'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>fdc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>scsi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>sata</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-non-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </disk>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <graphics supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vnc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>egl-headless</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dbus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </graphics>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <video supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='modelType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vga</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>cirrus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>none</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>bochs</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>ramfb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </video>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <hostdev supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='mode'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>subsystem</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='startupPolicy'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>default</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>mandatory</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>requisite</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>optional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='subsysType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pci</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>scsi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='capsType'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='pciBackend'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </hostdev>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <rng supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-non-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>random</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>egd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>builtin</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </rng>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <filesystem supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='driverType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>path</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>handle</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtiofs</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </filesystem>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <tpm supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tpm-tis</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tpm-crb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>emulator</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>external</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendVersion'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>2.0</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </tpm>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <redirdev supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='bus'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </redirdev>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <channel supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pty</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>unix</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </channel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <crypto supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>qemu</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>builtin</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </crypto>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <interface supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>default</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>passt</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </interface>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <panic supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>isa</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>hyperv</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </panic>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <console supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>null</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pty</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dev</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>file</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pipe</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>stdio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>udp</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tcp</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>unix</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>qemu-vdagent</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dbus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </console>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </devices>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <features>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <gic supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <vmcoreinfo supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <genid supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <backingStoreInput supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <backup supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <async-teardown supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <s390-pv supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <ps2 supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <tdx supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <sev supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <sgx supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <hyperv supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='features'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>relaxed</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vapic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>spinlocks</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vpindex</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>runtime</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>synic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>stimer</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>reset</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vendor_id</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>frequencies</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>reenlightenment</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tlbflush</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>ipi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>avic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>emsr_bitmap</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>xmm_input</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <defaults>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <spinlocks>4095</spinlocks>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <stimer_direct>on</stimer_direct>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </defaults>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </hyperv>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <launchSecurity supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </features>
Feb 16 17:16:19 compute-0 nova_compute[184779]: </domainCapabilities>
Feb 16 17:16:19 compute-0 nova_compute[184779]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.594 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 16 17:16:19 compute-0 nova_compute[184779]: <domainCapabilities>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <domain>kvm</domain>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <arch>i686</arch>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <vcpu max='240'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <iothreads supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <os supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <enum name='firmware'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <loader supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>rom</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pflash</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='readonly'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>yes</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>no</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='secure'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>no</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </loader>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </os>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <cpu>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='host-passthrough' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='hostPassthroughMigratable'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>on</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>off</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='maximum' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='maximumMigratable'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>on</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>off</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='host-model' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <vendor>AMD</vendor>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='x2apic'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='hypervisor'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='stibp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='overflow-recov'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='succor'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='lbrv'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc-scale'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='flushbyasid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='pause-filter'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='pfthreshold'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='disable' name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='custom' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='ClearwaterForest'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ddpd-u'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sha512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm3'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='ClearwaterForest-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ddpd-u'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sha512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm3'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Dhyana-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Turin'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbpb'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Turin-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbpb'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-128'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-256'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-128'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-256'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v6'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v7'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='KnightsMill'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512er'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512pf'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='KnightsMill-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512er'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512pf'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G4-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tbm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G5-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tbm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='athlon'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='athlon-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='core2duo'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='core2duo-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='coreduo'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='coreduo-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='n270'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='n270-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='phenom'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='phenom-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </cpu>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <memoryBacking supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <enum name='sourceType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>file</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>anonymous</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>memfd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </memoryBacking>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <devices>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <disk supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='diskDevice'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>disk</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>cdrom</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>floppy</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>lun</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='bus'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>ide</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>fdc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>scsi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>sata</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-non-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </disk>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <graphics supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vnc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>egl-headless</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dbus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </graphics>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <video supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='modelType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vga</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>cirrus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>none</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>bochs</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>ramfb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </video>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <hostdev supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='mode'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>subsystem</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='startupPolicy'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>default</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>mandatory</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>requisite</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>optional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='subsysType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pci</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>scsi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='capsType'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='pciBackend'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </hostdev>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <rng supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-non-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>random</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>egd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>builtin</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </rng>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <filesystem supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='driverType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>path</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>handle</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtiofs</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </filesystem>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <tpm supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tpm-tis</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tpm-crb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>emulator</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>external</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendVersion'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>2.0</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </tpm>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <redirdev supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='bus'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </redirdev>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <channel supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pty</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>unix</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </channel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <crypto supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>qemu</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>builtin</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </crypto>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <interface supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>default</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>passt</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </interface>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <panic supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>isa</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>hyperv</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </panic>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <console supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>null</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pty</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dev</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>file</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pipe</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>stdio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>udp</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tcp</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>unix</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>qemu-vdagent</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dbus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </console>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </devices>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <features>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <gic supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <vmcoreinfo supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <genid supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <backingStoreInput supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <backup supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <async-teardown supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <s390-pv supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <ps2 supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <tdx supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <sev supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <sgx supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <hyperv supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='features'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>relaxed</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vapic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>spinlocks</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vpindex</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>runtime</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>synic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>stimer</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>reset</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vendor_id</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>frequencies</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>reenlightenment</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tlbflush</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>ipi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>avic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>emsr_bitmap</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>xmm_input</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <defaults>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <spinlocks>4095</spinlocks>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <stimer_direct>on</stimer_direct>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </defaults>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </hyperv>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <launchSecurity supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </features>
Feb 16 17:16:19 compute-0 nova_compute[184779]: </domainCapabilities>
Feb 16 17:16:19 compute-0 nova_compute[184779]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.654 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.661 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 16 17:16:19 compute-0 nova_compute[184779]: <domainCapabilities>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <domain>kvm</domain>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <arch>x86_64</arch>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <vcpu max='4096'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <iothreads supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <os supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <enum name='firmware'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>efi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <loader supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>rom</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pflash</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='readonly'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>yes</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>no</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='secure'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>yes</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>no</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </loader>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </os>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <cpu>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='host-passthrough' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='hostPassthroughMigratable'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>on</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>off</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='maximum' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='maximumMigratable'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>on</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>off</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='host-model' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <vendor>AMD</vendor>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='x2apic'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='hypervisor'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='stibp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='overflow-recov'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='succor'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='lbrv'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc-scale'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='flushbyasid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='pause-filter'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='pfthreshold'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='disable' name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='custom' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='ClearwaterForest'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ddpd-u'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sha512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm3'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='ClearwaterForest-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ddpd-u'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sha512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm3'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Dhyana-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Turin'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbpb'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Turin-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbpb'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-128'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-256'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-128'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-256'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v6'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v7'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='KnightsMill'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512er'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512pf'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='KnightsMill-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512er'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512pf'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G4-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tbm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G5-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tbm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='athlon'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='athlon-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='core2duo'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='core2duo-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='coreduo'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='coreduo-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='n270'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='n270-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='phenom'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='phenom-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </cpu>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <memoryBacking supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <enum name='sourceType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>file</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>anonymous</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>memfd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </memoryBacking>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <devices>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <disk supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='diskDevice'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>disk</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>cdrom</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>floppy</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>lun</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='bus'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>fdc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>scsi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>sata</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-non-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </disk>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <graphics supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vnc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>egl-headless</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dbus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </graphics>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <video supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='modelType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vga</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>cirrus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>none</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>bochs</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>ramfb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </video>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <hostdev supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='mode'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>subsystem</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='startupPolicy'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>default</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>mandatory</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>requisite</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>optional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='subsysType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pci</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>scsi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='capsType'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='pciBackend'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </hostdev>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <rng supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-non-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>random</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>egd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>builtin</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </rng>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <filesystem supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='driverType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>path</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>handle</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtiofs</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </filesystem>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <tpm supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tpm-tis</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tpm-crb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>emulator</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>external</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendVersion'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>2.0</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </tpm>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <redirdev supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='bus'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </redirdev>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <channel supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pty</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>unix</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </channel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <crypto supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>qemu</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>builtin</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </crypto>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <interface supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>default</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>passt</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </interface>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <panic supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>isa</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>hyperv</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </panic>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <console supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>null</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pty</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dev</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>file</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pipe</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>stdio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>udp</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tcp</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>unix</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>qemu-vdagent</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dbus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </console>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </devices>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <features>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <gic supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <vmcoreinfo supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <genid supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <backingStoreInput supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <backup supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <async-teardown supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <s390-pv supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <ps2 supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <tdx supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <sev supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <sgx supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <hyperv supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='features'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>relaxed</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vapic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>spinlocks</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vpindex</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>runtime</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>synic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>stimer</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>reset</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vendor_id</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>frequencies</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>reenlightenment</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tlbflush</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>ipi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>avic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>emsr_bitmap</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>xmm_input</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <defaults>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <spinlocks>4095</spinlocks>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <stimer_direct>on</stimer_direct>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </defaults>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </hyperv>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <launchSecurity supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </features>
Feb 16 17:16:19 compute-0 nova_compute[184779]: </domainCapabilities>
Feb 16 17:16:19 compute-0 nova_compute[184779]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.744 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 16 17:16:19 compute-0 nova_compute[184779]: <domainCapabilities>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <domain>kvm</domain>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <arch>x86_64</arch>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <vcpu max='240'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <iothreads supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <os supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <enum name='firmware'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <loader supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>rom</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pflash</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='readonly'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>yes</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>no</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='secure'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>no</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </loader>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </os>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <cpu>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='host-passthrough' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='hostPassthroughMigratable'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>on</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>off</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='maximum' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='maximumMigratable'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>on</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>off</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='host-model' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <vendor>AMD</vendor>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='x2apic'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='hypervisor'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='stibp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='overflow-recov'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='succor'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='lbrv'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='tsc-scale'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='flushbyasid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='pause-filter'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='pfthreshold'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <feature policy='disable' name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <mode name='custom' supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Broadwell-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='ClearwaterForest'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ddpd-u'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sha512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm3'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='ClearwaterForest-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ddpd-u'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sha512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm3'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sm4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Cooperlake-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Denverton-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Dhyana-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Milan-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Rome-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Turin'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbpb'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-Turin-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amd-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='auto-ibrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='perfmon-v2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbpb'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='stibp-always-on'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='EPYC-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-128'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-256'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='GraniteRapids-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-128'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-256'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx10-512'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='prefetchiti'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Haswell-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v6'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Icelake-Server-v7'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='IvyBridge-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='KnightsMill'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512er'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512pf'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='KnightsMill-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512er'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512pf'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G4-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tbm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Opteron_G5-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fma4'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tbm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xop'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SapphireRapids-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='amx-tile'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-bf16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-fp16'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bitalg'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrc'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fzrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='la57'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='taa-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='SierraForest-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ifma'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cmpccxadd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fbsdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='fsrs'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ibrs-all'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='intel-psfd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='lam'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mcdt-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pbrsb-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='psdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='serialize'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vaes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Client-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='hle'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='rtm'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Skylake-Server-v5'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512bw'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512cd'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512dq'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512f'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='avx512vl'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='invpcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pcid'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='pku'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='mpx'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v2'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v3'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='core-capability'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='split-lock-detect'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='Snowridge-v4'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='cldemote'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='erms'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='gfni'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdir64b'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='movdiri'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='xsaves'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='athlon'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='athlon-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='core2duo'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='core2duo-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='coreduo'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='coreduo-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='n270'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='n270-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='ss'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='phenom'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <blockers model='phenom-v1'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnow'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <feature name='3dnowext'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </blockers>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </mode>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </cpu>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <memoryBacking supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <enum name='sourceType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>file</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>anonymous</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <value>memfd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </memoryBacking>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <devices>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <disk supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='diskDevice'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>disk</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>cdrom</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>floppy</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>lun</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='bus'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>ide</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>fdc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>scsi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>sata</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-non-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </disk>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <graphics supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vnc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>egl-headless</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dbus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </graphics>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <video supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='modelType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vga</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>cirrus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>none</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>bochs</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>ramfb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </video>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <hostdev supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='mode'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>subsystem</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='startupPolicy'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>default</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>mandatory</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>requisite</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>optional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='subsysType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pci</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>scsi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='capsType'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='pciBackend'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </hostdev>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <rng supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtio-non-transitional</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>random</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>egd</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>builtin</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </rng>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <filesystem supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='driverType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>path</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>handle</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>virtiofs</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </filesystem>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <tpm supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tpm-tis</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tpm-crb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>emulator</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>external</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendVersion'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>2.0</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </tpm>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <redirdev supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='bus'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>usb</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </redirdev>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <channel supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pty</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>unix</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </channel>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <crypto supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>qemu</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendModel'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>builtin</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </crypto>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <interface supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='backendType'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>default</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>passt</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </interface>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <panic supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='model'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>isa</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>hyperv</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </panic>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <console supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='type'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>null</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vc</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pty</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dev</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>file</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>pipe</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>stdio</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>udp</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tcp</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>unix</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>qemu-vdagent</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>dbus</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </console>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </devices>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <features>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <gic supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <vmcoreinfo supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <genid supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <backingStoreInput supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <backup supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <async-teardown supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <s390-pv supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <ps2 supported='yes'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <tdx supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <sev supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <sgx supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <hyperv supported='yes'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <enum name='features'>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>relaxed</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vapic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>spinlocks</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vpindex</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>runtime</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>synic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>stimer</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>reset</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>vendor_id</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>frequencies</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>reenlightenment</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>tlbflush</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>ipi</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>avic</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>emsr_bitmap</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <value>xmm_input</value>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </enum>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       <defaults>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <spinlocks>4095</spinlocks>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <stimer_direct>on</stimer_direct>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 17:16:19 compute-0 nova_compute[184779]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 17:16:19 compute-0 nova_compute[184779]:       </defaults>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     </hyperv>
Feb 16 17:16:19 compute-0 nova_compute[184779]:     <launchSecurity supported='no'/>
Feb 16 17:16:19 compute-0 nova_compute[184779]:   </features>
Feb 16 17:16:19 compute-0 nova_compute[184779]: </domainCapabilities>
Feb 16 17:16:19 compute-0 nova_compute[184779]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.812 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.813 184783 INFO nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Secure Boot support detected
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.815 184783 INFO nova.virt.libvirt.driver [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.815 184783 INFO nova.virt.libvirt.driver [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.824 184783 DEBUG nova.virt.libvirt.driver [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] cpu compare xml: <cpu match="exact">
Feb 16 17:16:19 compute-0 nova_compute[184779]:   <model>Nehalem</model>
Feb 16 17:16:19 compute-0 nova_compute[184779]: </cpu>
Feb 16 17:16:19 compute-0 nova_compute[184779]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.829 184783 DEBUG nova.virt.libvirt.driver [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.856 184783 INFO nova.virt.node [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Determined node identity bb904aac-529f-46ef-9861-9c655a4b383c from /var/lib/nova/compute_id
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.878 184783 WARNING nova.compute.manager [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Compute nodes ['bb904aac-529f-46ef-9861-9c655a4b383c'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.910 184783 INFO nova.compute.manager [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.958 184783 WARNING nova.compute.manager [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.958 184783 DEBUG oslo_concurrency.lockutils [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.958 184783 DEBUG oslo_concurrency.lockutils [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.959 184783 DEBUG oslo_concurrency.lockutils [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:16:19 compute-0 nova_compute[184779]: 2026-02-16 17:16:19.959 184783 DEBUG nova.compute.resource_tracker [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:16:19 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 16 17:16:20 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 16 17:16:20 compute-0 python3.9[185785]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.227 184783 WARNING nova.virt.libvirt.driver [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.228 184783 DEBUG nova.compute.resource_tracker [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6166MB free_disk=73.43958282470703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.229 184783 DEBUG oslo_concurrency.lockutils [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.229 184783 DEBUG oslo_concurrency.lockutils [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.240 184783 WARNING nova.compute.resource_tracker [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] No compute node record for compute-0.ctlplane.example.com:bb904aac-529f-46ef-9861-9c655a4b383c: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host bb904aac-529f-46ef-9861-9c655a4b383c could not be found.
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.253 184783 INFO nova.compute.resource_tracker [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: bb904aac-529f-46ef-9861-9c655a4b383c
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.308 184783 DEBUG nova.compute.resource_tracker [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.308 184783 DEBUG nova.compute.resource_tracker [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.740 184783 INFO nova.scheduler.client.report [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] [req-beef0393-da75-433a-93b5-0bbdef31b280] Created resource provider record via placement API for resource provider with UUID bb904aac-529f-46ef-9861-9c655a4b383c and name compute-0.ctlplane.example.com.
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.764 184783 DEBUG nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 16 17:16:20 compute-0 nova_compute[184779]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.764 184783 INFO nova.virt.libvirt.host [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] kernel doesn't support AMD SEV
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.765 184783 DEBUG nova.compute.provider_tree [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.766 184783 DEBUG nova.virt.libvirt.driver [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.770 184783 DEBUG nova.virt.libvirt.driver [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Libvirt baseline CPU <cpu>
Feb 16 17:16:20 compute-0 nova_compute[184779]:   <arch>x86_64</arch>
Feb 16 17:16:20 compute-0 nova_compute[184779]:   <model>Nehalem</model>
Feb 16 17:16:20 compute-0 nova_compute[184779]:   <vendor>AMD</vendor>
Feb 16 17:16:20 compute-0 nova_compute[184779]:   <topology sockets="8" cores="1" threads="1"/>
Feb 16 17:16:20 compute-0 nova_compute[184779]: </cpu>
Feb 16 17:16:20 compute-0 nova_compute[184779]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.829 184783 DEBUG nova.scheduler.client.report [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Updated inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.830 184783 DEBUG nova.compute.provider_tree [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Updating resource provider bb904aac-529f-46ef-9861-9c655a4b383c generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 17:16:20 compute-0 nova_compute[184779]: 2026-02-16 17:16:20.830 184783 DEBUG nova.compute.provider_tree [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:16:21 compute-0 sudo[185937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtzxbfalswirmvognqlkrfbuqmkpraca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262180.46171-2640-131791291622806/AnsiballZ_podman_container.py'
Feb 16 17:16:21 compute-0 sudo[185937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:21 compute-0 nova_compute[184779]: 2026-02-16 17:16:21.281 184783 DEBUG nova.compute.provider_tree [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Updating resource provider bb904aac-529f-46ef-9861-9c655a4b383c generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 17:16:21 compute-0 python3.9[185939]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 16 17:16:21 compute-0 nova_compute[184779]: 2026-02-16 17:16:21.316 184783 DEBUG nova.compute.resource_tracker [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:16:21 compute-0 nova_compute[184779]: 2026-02-16 17:16:21.317 184783 DEBUG oslo_concurrency.lockutils [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:16:21 compute-0 rsyslogd[1020]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 17:16:21 compute-0 nova_compute[184779]: 2026-02-16 17:16:21.318 184783 DEBUG nova.service [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 16 17:16:21 compute-0 rsyslogd[1020]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 17:16:21 compute-0 nova_compute[184779]: 2026-02-16 17:16:21.384 184783 DEBUG nova.service [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 16 17:16:21 compute-0 nova_compute[184779]: 2026-02-16 17:16:21.385 184783 DEBUG nova.servicegroup.drivers.db [None req-61acbdc6-3102-4778-b828-c41658862240 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 16 17:16:21 compute-0 sudo[185937]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:22 compute-0 sudo[186111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgpjucyqiyppelmlcksiwxkiojldcnnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262182.4769454-2656-19912607790951/AnsiballZ_systemd.py'
Feb 16 17:16:22 compute-0 sudo[186111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:23 compute-0 python3.9[186113]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 16 17:16:23 compute-0 systemd[1]: Stopping nova_compute container...
Feb 16 17:16:23 compute-0 nova_compute[184779]: 2026-02-16 17:16:23.452 184783 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 16 17:16:23 compute-0 nova_compute[184779]: 2026-02-16 17:16:23.455 184783 DEBUG oslo_concurrency.lockutils [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:16:23 compute-0 nova_compute[184779]: 2026-02-16 17:16:23.456 184783 DEBUG oslo_concurrency.lockutils [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:16:23 compute-0 nova_compute[184779]: 2026-02-16 17:16:23.456 184783 DEBUG oslo_concurrency.lockutils [None req-e0aee380-4bcb-485b-9485-9e69f9aea432 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:16:23 compute-0 virtqemud[185389]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 16 17:16:23 compute-0 virtqemud[185389]: hostname: compute-0
Feb 16 17:16:23 compute-0 virtqemud[185389]: End of file while reading data: Input/output error
Feb 16 17:16:23 compute-0 systemd[1]: libpod-14cb986004037f98982308929308e47e5959fca75770a482149a3f55897e552b.scope: Deactivated successfully.
Feb 16 17:16:23 compute-0 systemd[1]: libpod-14cb986004037f98982308929308e47e5959fca75770a482149a3f55897e552b.scope: Consumed 3.378s CPU time.
Feb 16 17:16:23 compute-0 podman[186117]: 2026-02-16 17:16:23.953388151 +0000 UTC m=+0.832101589 container died 14cb986004037f98982308929308e47e5959fca75770a482149a3f55897e552b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 16 17:16:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-14cb986004037f98982308929308e47e5959fca75770a482149a3f55897e552b-userdata-shm.mount: Deactivated successfully.
Feb 16 17:16:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-0b361c4056db2caae3f039b441bf71654a0ba1cbf92f17ba678197a47791d67f-merged.mount: Deactivated successfully.
Feb 16 17:16:24 compute-0 podman[186117]: 2026-02-16 17:16:24.006756972 +0000 UTC m=+0.885470400 container cleanup 14cb986004037f98982308929308e47e5959fca75770a482149a3f55897e552b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 17:16:24 compute-0 podman[186117]: nova_compute
Feb 16 17:16:24 compute-0 podman[186147]: nova_compute
Feb 16 17:16:24 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 16 17:16:24 compute-0 systemd[1]: Stopped nova_compute container.
Feb 16 17:16:24 compute-0 systemd[1]: Starting nova_compute container...
Feb 16 17:16:24 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:16:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b361c4056db2caae3f039b441bf71654a0ba1cbf92f17ba678197a47791d67f/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b361c4056db2caae3f039b441bf71654a0ba1cbf92f17ba678197a47791d67f/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b361c4056db2caae3f039b441bf71654a0ba1cbf92f17ba678197a47791d67f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b361c4056db2caae3f039b441bf71654a0ba1cbf92f17ba678197a47791d67f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b361c4056db2caae3f039b441bf71654a0ba1cbf92f17ba678197a47791d67f/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:24 compute-0 podman[186160]: 2026-02-16 17:16:24.182011801 +0000 UTC m=+0.076928206 container init 14cb986004037f98982308929308e47e5959fca75770a482149a3f55897e552b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, config_id=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 16 17:16:24 compute-0 podman[186160]: 2026-02-16 17:16:24.191553477 +0000 UTC m=+0.086469832 container start 14cb986004037f98982308929308e47e5959fca75770a482149a3f55897e552b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Feb 16 17:16:24 compute-0 podman[186160]: nova_compute
Feb 16 17:16:24 compute-0 systemd[1]: Started nova_compute container.
Feb 16 17:16:24 compute-0 nova_compute[186176]: + sudo -E kolla_set_configs
Feb 16 17:16:24 compute-0 sudo[186111]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Validating config file
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Copying service configuration files
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Deleting /etc/ceph
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Creating directory /etc/ceph
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /etc/ceph
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Writing out command to execute
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 16 17:16:24 compute-0 nova_compute[186176]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 16 17:16:24 compute-0 nova_compute[186176]: ++ cat /run_command
Feb 16 17:16:24 compute-0 nova_compute[186176]: + CMD=nova-compute
Feb 16 17:16:24 compute-0 nova_compute[186176]: + ARGS=
Feb 16 17:16:24 compute-0 nova_compute[186176]: + sudo kolla_copy_cacerts
Feb 16 17:16:24 compute-0 nova_compute[186176]: + [[ ! -n '' ]]
Feb 16 17:16:24 compute-0 nova_compute[186176]: + . kolla_extend_start
Feb 16 17:16:24 compute-0 nova_compute[186176]: Running command: 'nova-compute'
Feb 16 17:16:24 compute-0 nova_compute[186176]: + echo 'Running command: '\''nova-compute'\'''
Feb 16 17:16:24 compute-0 nova_compute[186176]: + umask 0022
Feb 16 17:16:24 compute-0 nova_compute[186176]: + exec nova-compute
Feb 16 17:16:24 compute-0 sudo[186337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-immvvltsaxkkubcrbrndqmvjnjnnbuby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262184.4132943-2674-74801222446859/AnsiballZ_podman_container.py'
Feb 16 17:16:24 compute-0 sudo[186337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:24 compute-0 python3.9[186339]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 16 17:16:25 compute-0 systemd[1]: Started libpod-conmon-2c036256c52622a206c9db1c170805517a68d3619b6ccf11395393261d9c0fdb.scope.
Feb 16 17:16:25 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:16:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a74e3fa5a580d8c13ad58a159ffe8b8e9d79f7c15fd5eb31a0299a9a22e6b5f1/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a74e3fa5a580d8c13ad58a159ffe8b8e9d79f7c15fd5eb31a0299a9a22e6b5f1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:25 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a74e3fa5a580d8c13ad58a159ffe8b8e9d79f7c15fd5eb31a0299a9a22e6b5f1/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 16 17:16:25 compute-0 podman[186366]: 2026-02-16 17:16:25.162660138 +0000 UTC m=+0.139913245 container init 2c036256c52622a206c9db1c170805517a68d3619b6ccf11395393261d9c0fdb (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.build-date=20260127, config_id=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:16:25 compute-0 podman[186366]: 2026-02-16 17:16:25.170499432 +0000 UTC m=+0.147752539 container start 2c036256c52622a206c9db1c170805517a68d3619b6ccf11395393261d9c0fdb (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 17:16:25 compute-0 python3.9[186339]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Applying nova statedir ownership
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 16 17:16:25 compute-0 nova_compute_init[186388]: INFO:nova_statedir:Nova statedir ownership complete
Feb 16 17:16:25 compute-0 systemd[1]: libpod-2c036256c52622a206c9db1c170805517a68d3619b6ccf11395393261d9c0fdb.scope: Deactivated successfully.
Feb 16 17:16:25 compute-0 podman[186408]: 2026-02-16 17:16:25.28273096 +0000 UTC m=+0.032958947 container died 2c036256c52622a206c9db1c170805517a68d3619b6ccf11395393261d9c0fdb (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 17:16:25 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2c036256c52622a206c9db1c170805517a68d3619b6ccf11395393261d9c0fdb-userdata-shm.mount: Deactivated successfully.
Feb 16 17:16:25 compute-0 systemd[1]: var-lib-containers-storage-overlay-a74e3fa5a580d8c13ad58a159ffe8b8e9d79f7c15fd5eb31a0299a9a22e6b5f1-merged.mount: Deactivated successfully.
Feb 16 17:16:25 compute-0 podman[186408]: 2026-02-16 17:16:25.305516504 +0000 UTC m=+0.055744481 container cleanup 2c036256c52622a206c9db1c170805517a68d3619b6ccf11395393261d9c0fdb (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '061196b006e9df4f09453dc1952139056318917ffb037c9557c67e4d5224d409'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, config_id=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 17:16:25 compute-0 sudo[186337]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:25 compute-0 systemd[1]: libpod-conmon-2c036256c52622a206c9db1c170805517a68d3619b6ccf11395393261d9c0fdb.scope: Deactivated successfully.
Feb 16 17:16:25 compute-0 sshd-session[161251]: Connection closed by 192.168.122.30 port 45152
Feb 16 17:16:25 compute-0 sshd-session[161248]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:16:25 compute-0 systemd-logind[821]: Session 24 logged out. Waiting for processes to exit.
Feb 16 17:16:25 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Feb 16 17:16:25 compute-0 systemd[1]: session-24.scope: Consumed 1min 37.372s CPU time.
Feb 16 17:16:25 compute-0 systemd-logind[821]: Removed session 24.
Feb 16 17:16:26 compute-0 nova_compute[186176]: 2026-02-16 17:16:26.136 186180 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 17:16:26 compute-0 nova_compute[186176]: 2026-02-16 17:16:26.136 186180 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 17:16:26 compute-0 nova_compute[186176]: 2026-02-16 17:16:26.136 186180 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 16 17:16:26 compute-0 nova_compute[186176]: 2026-02-16 17:16:26.136 186180 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 16 17:16:26 compute-0 nova_compute[186176]: 2026-02-16 17:16:26.284 186180 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:16:26 compute-0 nova_compute[186176]: 2026-02-16 17:16:26.306 186180 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:16:26 compute-0 nova_compute[186176]: 2026-02-16 17:16:26.307 186180 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 16 17:16:26 compute-0 nova_compute[186176]: 2026-02-16 17:16:26.933 186180 INFO nova.virt.driver [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.050 186180 INFO nova.compute.provider_config [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.060 186180 DEBUG oslo_concurrency.lockutils [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.060 186180 DEBUG oslo_concurrency.lockutils [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.060 186180 DEBUG oslo_concurrency.lockutils [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.061 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.061 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.061 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.061 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.061 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.062 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.062 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.062 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.062 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.062 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.062 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.063 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.063 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.063 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.063 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.063 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.063 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.063 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.064 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.064 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.064 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.064 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.064 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.064 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.064 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.065 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.065 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.065 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.065 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.065 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.065 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.065 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.066 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.066 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.066 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.066 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.066 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.066 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.067 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.067 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.067 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.067 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.067 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.068 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.068 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.068 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.068 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.068 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.068 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.069 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.069 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.069 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.069 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.069 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.069 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.070 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.070 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.070 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.070 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.070 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.071 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.071 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.071 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.071 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.071 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.071 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.072 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.072 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.072 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.072 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.072 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.072 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.073 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.073 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.073 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.073 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.073 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.074 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.074 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.074 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.074 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.074 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.074 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.074 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.075 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.075 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.075 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.075 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.075 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.075 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.075 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.076 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.076 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.076 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.076 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.076 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.076 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.076 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.077 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.077 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.077 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.077 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.077 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.077 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.078 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.078 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.078 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.078 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.078 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.079 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.079 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.079 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.079 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.079 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.079 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.079 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.080 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.080 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.080 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.080 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.080 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.080 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.080 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.081 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.081 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.081 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.081 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.081 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.081 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.081 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.082 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.082 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.082 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.082 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.082 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.082 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.083 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.083 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.083 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.083 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.083 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.084 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.084 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.084 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.084 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.084 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.085 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.085 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.085 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.085 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.085 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.085 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.086 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.086 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.086 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.086 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.086 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.087 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.087 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.087 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.087 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.087 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.088 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.088 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.088 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.088 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.088 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.089 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.089 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.089 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.089 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.089 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.090 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.090 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.090 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.090 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.091 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.091 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.091 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.091 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.091 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.092 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.092 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.092 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.092 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.092 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.093 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.093 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.093 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.093 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.094 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.094 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.094 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.094 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.094 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.094 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.095 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.095 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.095 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.095 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.095 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.096 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.096 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.096 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.096 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.096 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.096 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.097 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.097 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.097 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.097 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.097 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.098 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.098 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.098 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.098 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.098 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.099 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.099 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.099 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.099 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.099 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.099 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.100 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.100 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.100 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.100 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.100 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.101 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.101 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.101 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.101 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.101 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.101 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.102 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.102 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.102 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.102 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.102 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.103 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.103 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.103 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.103 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.103 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.103 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.104 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.104 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.104 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.104 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.104 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.104 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.105 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.105 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.105 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.105 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.105 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.106 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.106 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.106 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.106 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.106 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.106 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.107 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.107 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.107 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.107 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.107 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.108 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.108 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.108 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.108 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.108 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.108 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.108 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.109 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.109 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.109 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.109 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.109 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.109 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.109 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.110 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.111 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.111 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.111 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.111 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.111 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.111 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.111 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.111 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.112 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.112 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.112 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.112 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.112 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.112 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.112 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.113 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.113 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.113 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.113 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.113 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.113 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.113 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.114 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.114 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.114 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.114 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.114 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.114 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.115 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.115 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.115 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.115 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.116 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.116 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.116 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.116 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.116 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.116 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.116 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.117 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.117 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.117 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.117 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.117 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.117 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.117 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.118 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.118 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.118 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.118 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.118 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.118 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.118 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.119 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.119 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.119 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.119 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.119 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.119 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.119 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.120 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.120 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.120 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.120 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.120 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.120 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.121 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.121 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.121 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.121 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.121 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.121 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.121 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.122 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.122 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.122 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.122 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.122 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.122 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.123 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.123 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.123 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.123 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.123 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.123 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.123 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.124 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.124 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.124 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.124 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.124 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.124 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.124 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.125 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.125 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.125 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.125 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.125 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.126 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.126 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.126 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.126 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.126 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.126 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.127 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.127 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.127 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.127 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.127 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.128 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.128 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.128 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.128 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.128 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.129 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.129 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.129 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.129 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.129 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.130 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.130 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.130 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.130 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.131 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.131 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.131 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.131 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.131 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.131 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.132 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.132 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.132 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.132 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.132 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.133 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.133 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.133 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.133 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.133 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.134 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.134 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.134 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.134 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.134 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.135 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.135 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.135 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.135 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.135 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.135 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.136 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.136 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.136 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.136 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.136 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.136 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.137 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.137 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.137 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.137 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.137 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.138 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.138 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.138 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.138 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.138 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.139 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.139 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.139 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.139 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.139 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.140 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.140 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.140 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.140 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.140 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.140 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.141 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.141 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.141 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.141 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.141 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.142 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.142 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.142 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.142 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.142 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.143 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.143 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.143 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.143 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.143 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.144 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.144 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.144 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.144 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.144 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.145 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.145 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.145 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.145 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.145 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.146 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.146 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.146 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.146 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.147 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.147 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.147 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.147 186180 WARNING oslo_config.cfg [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 16 17:16:27 compute-0 nova_compute[186176]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 16 17:16:27 compute-0 nova_compute[186176]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 16 17:16:27 compute-0 nova_compute[186176]: and ``live_migration_inbound_addr`` respectively.
Feb 16 17:16:27 compute-0 nova_compute[186176]: ).  Its value may be silently ignored in the future.
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.148 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.148 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.148 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.148 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.148 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.149 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.149 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.149 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.149 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.149 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.149 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.150 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.150 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.150 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.150 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.150 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.151 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.151 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.151 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.151 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.151 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.151 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.152 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.152 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.152 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.152 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.152 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.152 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.152 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.153 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.153 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.153 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.153 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.153 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.153 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.154 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.154 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.154 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.154 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.154 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.155 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.155 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.155 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.155 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.155 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.156 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.156 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.156 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.156 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.156 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.156 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.156 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.157 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.157 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.157 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.157 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.157 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.157 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.158 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.158 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.158 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.158 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.158 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.158 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.159 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.159 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.159 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.159 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.159 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.159 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.159 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.160 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.160 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.160 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.160 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.160 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.160 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.160 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.161 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.161 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.161 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.161 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.161 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.161 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.161 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.162 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.162 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.162 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.162 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.162 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.162 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.163 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.163 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.163 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.163 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.163 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.163 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.163 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.163 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.164 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.164 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.164 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.164 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.164 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.164 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.164 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.165 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.165 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.165 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.165 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.165 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.165 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.166 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.166 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.166 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.166 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.166 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.166 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.166 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.167 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.167 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.167 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.167 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.167 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.167 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.167 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.168 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.168 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.168 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.168 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.168 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.168 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.168 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.168 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.169 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.169 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.169 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.169 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.169 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.170 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.170 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.170 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.170 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.170 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.170 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.170 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.171 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.171 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.171 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.171 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.171 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.171 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.171 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.172 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.172 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.172 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.172 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.172 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.172 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.172 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.173 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.173 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.173 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.173 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.173 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.173 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.174 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.174 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.174 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.174 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.174 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.174 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.174 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.175 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.175 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.175 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.175 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.175 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.175 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.176 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.176 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.176 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.176 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.176 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.176 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.176 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.177 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.177 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.177 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.177 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.177 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.177 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.177 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.178 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.178 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.178 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.178 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.178 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.178 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.179 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.179 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.179 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.179 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.179 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.179 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.180 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.180 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.180 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.180 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.180 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.180 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.180 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.181 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.181 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.181 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.181 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.181 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.181 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.182 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.182 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.182 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.182 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.182 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.182 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.182 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.182 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.183 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.183 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.183 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.183 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.183 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.183 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.183 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.184 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.184 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.184 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.184 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.184 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.184 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.185 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.185 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.185 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.185 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.185 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.185 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.186 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.186 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.186 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.186 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.187 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.187 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.187 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.187 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.187 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.187 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.187 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.188 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.188 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.188 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.188 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.188 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.188 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.188 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.189 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.189 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.189 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.189 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.189 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.189 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.189 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.190 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.190 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.190 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.190 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.190 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.190 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.190 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.191 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.191 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.191 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.191 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.191 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.191 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.191 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.192 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.192 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.192 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.192 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.192 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.192 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.193 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.193 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.193 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.193 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.193 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.194 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.194 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.194 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.194 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.194 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.194 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.195 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.195 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.195 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.195 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.195 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.195 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.196 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.196 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.196 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.196 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.196 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.196 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.196 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.197 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.197 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.197 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.197 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.197 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.197 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.198 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.198 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.198 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.198 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.198 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.198 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.198 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.199 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.199 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.199 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.199 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.199 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.199 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.200 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.200 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.200 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.200 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.200 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.201 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.201 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.201 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.201 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.201 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.201 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.201 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.202 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.202 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.202 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.202 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.202 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.202 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.202 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.203 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.203 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.203 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.203 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.203 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.203 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.203 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.204 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.204 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.204 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.204 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.204 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.204 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.204 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.204 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.205 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.205 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.205 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.205 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.205 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.205 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.205 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.206 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.206 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.206 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.206 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.206 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.207 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.207 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.207 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.207 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.207 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.207 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.207 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.208 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.208 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.208 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.208 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.208 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.208 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.208 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.209 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.209 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.209 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.209 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.209 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.209 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.209 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.210 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.210 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.210 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.210 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.210 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.210 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.210 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.211 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.211 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.211 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.211 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.211 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.211 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.211 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.211 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.212 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.212 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.212 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.212 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.212 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.212 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.212 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.213 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.213 186180 DEBUG oslo_service.service [None req-81b2b77c-b384-4a45-8d8c-22f6c41233ef - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.214 186180 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.230 186180 INFO nova.virt.node [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Determined node identity bb904aac-529f-46ef-9861-9c655a4b383c from /var/lib/nova/compute_id
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.230 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.231 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.231 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.232 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.246 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fc2ada00a60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.248 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fc2ada00a60> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.249 186180 INFO nova.virt.libvirt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Connection event '1' reason 'None'
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.254 186180 INFO nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Libvirt host capabilities <capabilities>
Feb 16 17:16:27 compute-0 nova_compute[186176]: 
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <host>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <uuid>a72ae0da-02c0-4729-9eb8-f910b339152d</uuid>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <cpu>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <arch>x86_64</arch>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model>EPYC-Rome-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <vendor>AMD</vendor>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <microcode version='16777317'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <signature family='23' model='49' stepping='0'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='x2apic'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='tsc-deadline'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='osxsave'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='hypervisor'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='tsc_adjust'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='spec-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='stibp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='arch-capabilities'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='cmp_legacy'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='topoext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='virt-ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='lbrv'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='tsc-scale'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='vmcb-clean'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='pause-filter'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='pfthreshold'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='svme-addr-chk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='rdctl-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='skip-l1dfl-vmentry'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='mds-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature name='pschange-mc-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <pages unit='KiB' size='4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <pages unit='KiB' size='2048'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <pages unit='KiB' size='1048576'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </cpu>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <power_management>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <suspend_mem/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <suspend_disk/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <suspend_hybrid/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </power_management>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <iommu support='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <migration_features>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <live/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <uri_transports>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <uri_transport>tcp</uri_transport>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <uri_transport>rdma</uri_transport>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </uri_transports>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </migration_features>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <topology>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <cells num='1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <cell id='0'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:           <memory unit='KiB'>7864292</memory>
Feb 16 17:16:27 compute-0 nova_compute[186176]:           <pages unit='KiB' size='4'>1966073</pages>
Feb 16 17:16:27 compute-0 nova_compute[186176]:           <pages unit='KiB' size='2048'>0</pages>
Feb 16 17:16:27 compute-0 nova_compute[186176]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 16 17:16:27 compute-0 nova_compute[186176]:           <distances>
Feb 16 17:16:27 compute-0 nova_compute[186176]:             <sibling id='0' value='10'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:           </distances>
Feb 16 17:16:27 compute-0 nova_compute[186176]:           <cpus num='8'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:           </cpus>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         </cell>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </cells>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </topology>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <cache>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </cache>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <secmodel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model>selinux</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <doi>0</doi>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </secmodel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <secmodel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model>dac</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <doi>0</doi>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </secmodel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </host>
Feb 16 17:16:27 compute-0 nova_compute[186176]: 
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <guest>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <os_type>hvm</os_type>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <arch name='i686'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <wordsize>32</wordsize>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <domain type='qemu'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <domain type='kvm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </arch>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <features>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <pae/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <nonpae/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <acpi default='on' toggle='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <apic default='on' toggle='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <cpuselection/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <deviceboot/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <disksnapshot default='on' toggle='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <externalSnapshot/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </features>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </guest>
Feb 16 17:16:27 compute-0 nova_compute[186176]: 
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <guest>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <os_type>hvm</os_type>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <arch name='x86_64'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <wordsize>64</wordsize>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <domain type='qemu'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <domain type='kvm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </arch>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <features>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <acpi default='on' toggle='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <apic default='on' toggle='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <cpuselection/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <deviceboot/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <disksnapshot default='on' toggle='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <externalSnapshot/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </features>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </guest>
Feb 16 17:16:27 compute-0 nova_compute[186176]: 
Feb 16 17:16:27 compute-0 nova_compute[186176]: </capabilities>
Feb 16 17:16:27 compute-0 nova_compute[186176]: 
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.265 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.270 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 16 17:16:27 compute-0 nova_compute[186176]: <domainCapabilities>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <domain>kvm</domain>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <arch>i686</arch>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <vcpu max='240'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <iothreads supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <os supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <enum name='firmware'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <loader supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>rom</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pflash</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='readonly'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>yes</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>no</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='secure'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>no</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </loader>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </os>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <cpu>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='host-passthrough' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='hostPassthroughMigratable'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>on</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>off</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='maximum' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='maximumMigratable'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>on</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>off</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='host-model' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <vendor>AMD</vendor>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='x2apic'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='hypervisor'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='stibp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='overflow-recov'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='succor'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='lbrv'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc-scale'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='flushbyasid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='pause-filter'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='pfthreshold'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='disable' name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='custom' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='ClearwaterForest'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ddpd-u'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sha512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm3'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='ClearwaterForest-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ddpd-u'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sha512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm3'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Dhyana-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Turin'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbpb'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Turin-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbpb'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-128'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-256'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-128'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-256'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v6'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v7'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='KnightsMill'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512er'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512pf'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='KnightsMill-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512er'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512pf'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G4-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tbm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G5-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tbm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='athlon'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='athlon-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='core2duo'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='core2duo-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='coreduo'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='coreduo-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='n270'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='n270-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='phenom'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='phenom-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <memoryBacking supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <enum name='sourceType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>file</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>anonymous</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>memfd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </memoryBacking>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <disk supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='diskDevice'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>disk</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>cdrom</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>floppy</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>lun</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='bus'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>ide</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>fdc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>scsi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>sata</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-non-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <graphics supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vnc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>egl-headless</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dbus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </graphics>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <video supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='modelType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vga</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>cirrus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>none</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>bochs</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>ramfb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </video>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <hostdev supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='mode'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>subsystem</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='startupPolicy'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>default</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>mandatory</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>requisite</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>optional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='subsysType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pci</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>scsi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='capsType'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='pciBackend'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </hostdev>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <rng supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-non-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>random</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>egd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>builtin</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <filesystem supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='driverType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>path</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>handle</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtiofs</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </filesystem>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <tpm supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tpm-tis</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tpm-crb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>emulator</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>external</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendVersion'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>2.0</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </tpm>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <redirdev supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='bus'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </redirdev>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <channel supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pty</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>unix</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </channel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <crypto supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>qemu</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>builtin</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </crypto>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <interface supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>default</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>passt</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <panic supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>isa</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>hyperv</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </panic>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <console supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>null</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pty</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dev</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>file</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pipe</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>stdio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>udp</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tcp</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>unix</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>qemu-vdagent</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dbus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </console>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <features>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <gic supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <vmcoreinfo supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <genid supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <backingStoreInput supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <backup supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <async-teardown supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <s390-pv supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <ps2 supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <tdx supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <sev supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <sgx supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <hyperv supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='features'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>relaxed</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vapic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>spinlocks</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vpindex</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>runtime</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>synic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>stimer</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>reset</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vendor_id</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>frequencies</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>reenlightenment</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tlbflush</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>ipi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>avic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>emsr_bitmap</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>xmm_input</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <defaults>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <spinlocks>4095</spinlocks>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <stimer_direct>on</stimer_direct>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </defaults>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </hyperv>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <launchSecurity supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </features>
Feb 16 17:16:27 compute-0 nova_compute[186176]: </domainCapabilities>
Feb 16 17:16:27 compute-0 nova_compute[186176]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.276 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 16 17:16:27 compute-0 nova_compute[186176]: <domainCapabilities>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <domain>kvm</domain>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <arch>i686</arch>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <vcpu max='4096'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <iothreads supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <os supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <enum name='firmware'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <loader supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>rom</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pflash</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='readonly'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>yes</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>no</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='secure'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>no</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </loader>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </os>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <cpu>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='host-passthrough' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='hostPassthroughMigratable'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>on</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>off</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='maximum' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='maximumMigratable'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>on</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>off</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='host-model' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <vendor>AMD</vendor>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='x2apic'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='hypervisor'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='stibp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='overflow-recov'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='succor'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='lbrv'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc-scale'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='flushbyasid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='pause-filter'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='pfthreshold'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='disable' name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='custom' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='ClearwaterForest'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ddpd-u'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sha512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm3'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='ClearwaterForest-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ddpd-u'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sha512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm3'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Dhyana-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Turin'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbpb'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Turin-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbpb'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-128'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-256'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-128'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-256'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v6'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v7'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='KnightsMill'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512er'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512pf'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='KnightsMill-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512er'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512pf'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G4-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tbm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G5-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tbm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='athlon'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='athlon-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='core2duo'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='core2duo-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='coreduo'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='coreduo-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='n270'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='n270-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='phenom'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='phenom-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <memoryBacking supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <enum name='sourceType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>file</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>anonymous</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>memfd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </memoryBacking>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <disk supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='diskDevice'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>disk</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>cdrom</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>floppy</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>lun</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='bus'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>fdc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>scsi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>sata</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-non-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <graphics supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vnc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>egl-headless</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dbus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </graphics>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <video supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='modelType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vga</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>cirrus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>none</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>bochs</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>ramfb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </video>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <hostdev supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='mode'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>subsystem</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='startupPolicy'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>default</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>mandatory</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>requisite</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>optional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='subsysType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pci</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>scsi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='capsType'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='pciBackend'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </hostdev>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <rng supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-non-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>random</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>egd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>builtin</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <filesystem supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='driverType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>path</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>handle</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtiofs</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </filesystem>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <tpm supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tpm-tis</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tpm-crb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>emulator</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>external</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendVersion'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>2.0</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </tpm>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <redirdev supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='bus'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </redirdev>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <channel supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pty</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>unix</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </channel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <crypto supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>qemu</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>builtin</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </crypto>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <interface supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>default</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>passt</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <panic supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>isa</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>hyperv</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </panic>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <console supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>null</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pty</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dev</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>file</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pipe</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>stdio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>udp</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tcp</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>unix</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>qemu-vdagent</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dbus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </console>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <features>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <gic supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <vmcoreinfo supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <genid supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <backingStoreInput supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <backup supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <async-teardown supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <s390-pv supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <ps2 supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <tdx supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <sev supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <sgx supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <hyperv supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='features'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>relaxed</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vapic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>spinlocks</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vpindex</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>runtime</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>synic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>stimer</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>reset</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vendor_id</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>frequencies</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>reenlightenment</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tlbflush</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>ipi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>avic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>emsr_bitmap</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>xmm_input</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <defaults>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <spinlocks>4095</spinlocks>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <stimer_direct>on</stimer_direct>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </defaults>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </hyperv>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <launchSecurity supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </features>
Feb 16 17:16:27 compute-0 nova_compute[186176]: </domainCapabilities>
Feb 16 17:16:27 compute-0 nova_compute[186176]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.338 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.340 186180 DEBUG nova.virt.libvirt.volume.mount [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.344 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 16 17:16:27 compute-0 nova_compute[186176]: <domainCapabilities>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <domain>kvm</domain>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <arch>x86_64</arch>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <vcpu max='240'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <iothreads supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <os supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <enum name='firmware'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <loader supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>rom</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pflash</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='readonly'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>yes</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>no</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='secure'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>no</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </loader>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </os>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <cpu>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='host-passthrough' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='hostPassthroughMigratable'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>on</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>off</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='maximum' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='maximumMigratable'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>on</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>off</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='host-model' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <vendor>AMD</vendor>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='x2apic'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='hypervisor'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='stibp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='overflow-recov'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='succor'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='lbrv'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc-scale'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='flushbyasid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='pause-filter'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='pfthreshold'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='disable' name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='custom' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='ClearwaterForest'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ddpd-u'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sha512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm3'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='ClearwaterForest-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ddpd-u'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sha512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm3'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Dhyana-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Turin'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbpb'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Turin-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbpb'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-128'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-256'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-128'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-256'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v6'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v7'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='KnightsMill'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512er'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512pf'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='KnightsMill-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512er'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512pf'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G4-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tbm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G5-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tbm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='athlon'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='athlon-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='core2duo'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='core2duo-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='coreduo'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='coreduo-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='n270'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='n270-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='phenom'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='phenom-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <memoryBacking supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <enum name='sourceType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>file</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>anonymous</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>memfd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </memoryBacking>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <disk supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='diskDevice'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>disk</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>cdrom</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>floppy</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>lun</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='bus'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>ide</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>fdc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>scsi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>sata</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-non-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <graphics supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vnc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>egl-headless</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dbus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </graphics>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <video supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='modelType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vga</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>cirrus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>none</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>bochs</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>ramfb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </video>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <hostdev supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='mode'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>subsystem</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='startupPolicy'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>default</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>mandatory</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>requisite</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>optional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='subsysType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pci</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>scsi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='capsType'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='pciBackend'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </hostdev>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <rng supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-non-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>random</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>egd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>builtin</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <filesystem supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='driverType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>path</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>handle</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtiofs</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </filesystem>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <tpm supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tpm-tis</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tpm-crb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>emulator</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>external</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendVersion'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>2.0</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </tpm>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <redirdev supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='bus'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </redirdev>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <channel supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pty</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>unix</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </channel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <crypto supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>qemu</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>builtin</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </crypto>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <interface supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>default</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>passt</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <panic supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>isa</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>hyperv</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </panic>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <console supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>null</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pty</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dev</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>file</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pipe</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>stdio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>udp</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tcp</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>unix</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>qemu-vdagent</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dbus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </console>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <features>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <gic supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <vmcoreinfo supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <genid supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <backingStoreInput supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <backup supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <async-teardown supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <s390-pv supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <ps2 supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <tdx supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <sev supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <sgx supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <hyperv supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='features'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>relaxed</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vapic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>spinlocks</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vpindex</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>runtime</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>synic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>stimer</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>reset</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vendor_id</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>frequencies</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>reenlightenment</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tlbflush</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>ipi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>avic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>emsr_bitmap</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>xmm_input</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <defaults>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <spinlocks>4095</spinlocks>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <stimer_direct>on</stimer_direct>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </defaults>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </hyperv>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <launchSecurity supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </features>
Feb 16 17:16:27 compute-0 nova_compute[186176]: </domainCapabilities>
Feb 16 17:16:27 compute-0 nova_compute[186176]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.423 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 16 17:16:27 compute-0 nova_compute[186176]: <domainCapabilities>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <path>/usr/libexec/qemu-kvm</path>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <domain>kvm</domain>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <arch>x86_64</arch>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <vcpu max='4096'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <iothreads supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <os supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <enum name='firmware'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>efi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <loader supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>rom</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pflash</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='readonly'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>yes</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>no</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='secure'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>yes</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>no</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </loader>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </os>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <cpu>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='host-passthrough' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='hostPassthroughMigratable'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>on</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>off</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='maximum' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='maximumMigratable'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>on</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>off</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='host-model' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <vendor>AMD</vendor>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='x2apic'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc-deadline'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='hypervisor'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc_adjust'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='spec-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='stibp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='cmp_legacy'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='overflow-recov'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='succor'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='amd-ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='virt-ssbd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='lbrv'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='tsc-scale'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='vmcb-clean'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='flushbyasid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='pause-filter'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='pfthreshold'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='svme-addr-chk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <feature policy='disable' name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <mode name='custom' supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Broadwell-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cascadelake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='ClearwaterForest'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ddpd-u'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sha512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm3'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='ClearwaterForest-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ddpd-u'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sha512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm3'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sm4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Cooperlake-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Denverton-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Dhyana-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Genoa-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Milan-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Rome-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Turin'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbpb'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-Turin-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amd-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='auto-ibrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vp2intersect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fs-gs-base-ns'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibpb-brtype'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='no-nested-data-bp'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='null-sel-clr-base'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='perfmon-v2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbpb'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='srso-user-kernel-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='stibp-always-on'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='EPYC-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-128'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-256'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='GraniteRapids-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-128'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-256'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx10-512'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='prefetchiti'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Haswell-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-noTSX'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v6'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Icelake-Server-v7'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='IvyBridge-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='KnightsMill'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512er'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512pf'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='KnightsMill-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4fmaps'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-4vnniw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512er'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512pf'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G4-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tbm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Opteron_G5-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fma4'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tbm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xop'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SapphireRapids-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='amx-tile'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-bf16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-fp16'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512-vpopcntdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bitalg'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vbmi2'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrc'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fzrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='la57'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='taa-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='tsx-ldtrk'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='SierraForest-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ifma'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-ne-convert'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx-vnni-int8'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bhi-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='bus-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cmpccxadd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fbsdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='fsrs'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ibrs-all'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='intel-psfd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ipred-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='lam'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mcdt-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pbrsb-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='psdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rrsba-ctrl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='sbdr-ssdp-no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='serialize'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vaes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='vpclmulqdq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Client-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='hle'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='rtm'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Skylake-Server-v5'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512bw'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512cd'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512dq'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512f'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='avx512vl'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='invpcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pcid'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='pku'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='mpx'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v2'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v3'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='core-capability'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='split-lock-detect'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='Snowridge-v4'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='cldemote'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='erms'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='gfni'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdir64b'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='movdiri'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='xsaves'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='athlon'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='athlon-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='core2duo'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='core2duo-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='coreduo'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='coreduo-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='n270'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='n270-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='ss'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='phenom'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <blockers model='phenom-v1'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnow'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <feature name='3dnowext'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </blockers>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </mode>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <memoryBacking supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <enum name='sourceType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>file</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>anonymous</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <value>memfd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </memoryBacking>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <disk supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='diskDevice'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>disk</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>cdrom</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>floppy</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>lun</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='bus'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>fdc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>scsi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>sata</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-non-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <graphics supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vnc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>egl-headless</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dbus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </graphics>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <video supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='modelType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vga</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>cirrus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>none</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>bochs</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>ramfb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </video>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <hostdev supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='mode'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>subsystem</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='startupPolicy'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>default</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>mandatory</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>requisite</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>optional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='subsysType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pci</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>scsi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='capsType'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='pciBackend'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </hostdev>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <rng supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtio-non-transitional</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>random</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>egd</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>builtin</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <filesystem supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='driverType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>path</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>handle</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>virtiofs</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </filesystem>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <tpm supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tpm-tis</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tpm-crb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>emulator</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>external</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendVersion'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>2.0</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </tpm>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <redirdev supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='bus'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>usb</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </redirdev>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <channel supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pty</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>unix</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </channel>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <crypto supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>qemu</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendModel'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>builtin</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </crypto>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <interface supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='backendType'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>default</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>passt</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <panic supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='model'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>isa</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>hyperv</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </panic>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <console supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='type'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>null</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vc</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pty</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dev</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>file</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>pipe</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>stdio</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>udp</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tcp</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>unix</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>qemu-vdagent</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>dbus</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </console>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <features>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <gic supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <vmcoreinfo supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <genid supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <backingStoreInput supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <backup supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <async-teardown supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <s390-pv supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <ps2 supported='yes'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <tdx supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <sev supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <sgx supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <hyperv supported='yes'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <enum name='features'>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>relaxed</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vapic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>spinlocks</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vpindex</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>runtime</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>synic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>stimer</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>reset</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>vendor_id</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>frequencies</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>reenlightenment</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>tlbflush</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>ipi</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>avic</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>emsr_bitmap</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <value>xmm_input</value>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </enum>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       <defaults>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <spinlocks>4095</spinlocks>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <stimer_direct>on</stimer_direct>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <tlbflush_direct>on</tlbflush_direct>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <tlbflush_extended>on</tlbflush_extended>
Feb 16 17:16:27 compute-0 nova_compute[186176]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 16 17:16:27 compute-0 nova_compute[186176]:       </defaults>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     </hyperv>
Feb 16 17:16:27 compute-0 nova_compute[186176]:     <launchSecurity supported='no'/>
Feb 16 17:16:27 compute-0 nova_compute[186176]:   </features>
Feb 16 17:16:27 compute-0 nova_compute[186176]: </domainCapabilities>
Feb 16 17:16:27 compute-0 nova_compute[186176]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.494 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.495 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.495 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.500 186180 INFO nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Secure Boot support detected
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.503 186180 INFO nova.virt.libvirt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.503 186180 INFO nova.virt.libvirt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.510 186180 DEBUG nova.virt.libvirt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] cpu compare xml: <cpu match="exact">
Feb 16 17:16:27 compute-0 nova_compute[186176]:   <model>Nehalem</model>
Feb 16 17:16:27 compute-0 nova_compute[186176]: </cpu>
Feb 16 17:16:27 compute-0 nova_compute[186176]:  _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.512 186180 DEBUG nova.virt.libvirt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.532 186180 INFO nova.virt.node [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Determined node identity bb904aac-529f-46ef-9861-9c655a4b383c from /var/lib/nova/compute_id
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.552 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Verified node bb904aac-529f-46ef-9861-9c655a4b383c matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.573 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.677 186180 DEBUG oslo_concurrency.lockutils [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.677 186180 DEBUG oslo_concurrency.lockutils [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.677 186180 DEBUG oslo_concurrency.lockutils [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.678 186180 DEBUG nova.compute.resource_tracker [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.811 186180 WARNING nova.virt.libvirt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.812 186180 DEBUG nova.compute.resource_tracker [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6124MB free_disk=73.43842315673828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.812 186180 DEBUG oslo_concurrency.lockutils [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.812 186180 DEBUG oslo_concurrency.lockutils [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.919 186180 DEBUG nova.compute.resource_tracker [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.919 186180 DEBUG nova.compute.resource_tracker [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.968 186180 DEBUG nova.scheduler.client.report [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Refreshing inventories for resource provider bb904aac-529f-46ef-9861-9c655a4b383c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.984 186180 DEBUG nova.scheduler.client.report [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Updating ProviderTree inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 17:16:27 compute-0 nova_compute[186176]: 2026-02-16 17:16:27.984 186180 DEBUG nova.compute.provider_tree [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:16:27 compute-0 rsyslogd[1020]: imjournal from <np0005621130:nova_compute>: begin to drop messages due to rate-limiting
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.001 186180 DEBUG nova.scheduler.client.report [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Refreshing aggregate associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.025 186180 DEBUG nova.scheduler.client.report [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Refreshing trait associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.051 186180 DEBUG nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 16 17:16:28 compute-0 nova_compute[186176]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.052 186180 INFO nova.virt.libvirt.host [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] kernel doesn't support AMD SEV
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.053 186180 DEBUG nova.compute.provider_tree [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.054 186180 DEBUG nova.virt.libvirt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.057 186180 DEBUG nova.virt.libvirt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Libvirt baseline CPU <cpu>
Feb 16 17:16:28 compute-0 nova_compute[186176]:   <arch>x86_64</arch>
Feb 16 17:16:28 compute-0 nova_compute[186176]:   <model>Nehalem</model>
Feb 16 17:16:28 compute-0 nova_compute[186176]:   <vendor>AMD</vendor>
Feb 16 17:16:28 compute-0 nova_compute[186176]:   <topology sockets="8" cores="1" threads="1"/>
Feb 16 17:16:28 compute-0 nova_compute[186176]: </cpu>
Feb 16 17:16:28 compute-0 nova_compute[186176]:  _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.089 186180 DEBUG nova.scheduler.client.report [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.133 186180 DEBUG nova.compute.resource_tracker [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.133 186180 DEBUG oslo_concurrency.lockutils [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.321s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.134 186180 DEBUG nova.service [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.170 186180 DEBUG nova.service [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 16 17:16:28 compute-0 nova_compute[186176]: 2026-02-16 17:16:28.171 186180 DEBUG nova.servicegroup.drivers.db [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 16 17:16:30 compute-0 sshd-session[186477]: Accepted publickey for zuul from 192.168.122.30 port 54026 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:16:30 compute-0 systemd-logind[821]: New session 26 of user zuul.
Feb 16 17:16:30 compute-0 systemd[1]: Started Session 26 of User zuul.
Feb 16 17:16:30 compute-0 sshd-session[186477]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:16:31 compute-0 python3.9[186630]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 16 17:16:32 compute-0 sudo[186784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpcsioejgsxibfxbzdkfdvaadycouopr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262192.3576741-52-259728080432357/AnsiballZ_systemd_service.py'
Feb 16 17:16:32 compute-0 sudo[186784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:33 compute-0 python3.9[186786]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 17:16:33 compute-0 systemd[1]: Reloading.
Feb 16 17:16:33 compute-0 systemd-rc-local-generator[186816]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:16:33 compute-0 systemd-sysv-generator[186823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:16:33 compute-0 sudo[186784]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:34 compute-0 python3.9[186979]: ansible-ansible.builtin.service_facts Invoked
Feb 16 17:16:34 compute-0 network[186996]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 16 17:16:34 compute-0 network[186997]: 'network-scripts' will be removed from distribution in near future.
Feb 16 17:16:34 compute-0 network[186998]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 16 17:16:38 compute-0 sudo[187269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmfqxcixknwujrixxwtefhyshtnwsavw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262197.7459016-90-114123916704877/AnsiballZ_systemd_service.py'
Feb 16 17:16:38 compute-0 sudo[187269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:16:38.141 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:16:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:16:38.143 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:16:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:16:38.144 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:16:38 compute-0 python3.9[187271]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:16:38 compute-0 sudo[187269]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:39 compute-0 sudo[187422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-towzwusaocovkjdynpjvdlhitpppqeqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262198.6675062-110-155436241661179/AnsiballZ_file.py'
Feb 16 17:16:39 compute-0 sudo[187422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:39 compute-0 python3.9[187424]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:39 compute-0 sudo[187422]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:39 compute-0 rsyslogd[1020]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 17:16:39 compute-0 rsyslogd[1020]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 17:16:39 compute-0 sudo[187575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjvjyfhvbpydpfwdcrolouziaonvztkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262199.4391084-126-183408891024630/AnsiballZ_file.py'
Feb 16 17:16:39 compute-0 sudo[187575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:39 compute-0 python3.9[187577]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:39 compute-0 sudo[187575]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:40 compute-0 sudo[187727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fivfrkjywsegbwrmkzhlbrjsgnaddfvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262200.080269-144-237948903848569/AnsiballZ_command.py'
Feb 16 17:16:40 compute-0 sudo[187727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:40 compute-0 python3.9[187729]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:16:40 compute-0 sudo[187727]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:41 compute-0 python3.9[187881]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 17:16:42 compute-0 sudo[188031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrlynegkeetwhyxbpglpzknqdfmlxarx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262201.7333724-180-55774702705777/AnsiballZ_systemd_service.py'
Feb 16 17:16:42 compute-0 sudo[188031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:42 compute-0 python3.9[188033]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 17:16:42 compute-0 systemd[1]: Reloading.
Feb 16 17:16:42 compute-0 systemd-rc-local-generator[188060]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:16:42 compute-0 systemd-sysv-generator[188064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:16:42 compute-0 sudo[188031]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:42 compute-0 podman[188075]: 2026-02-16 17:16:42.76306535 +0000 UTC m=+0.060934499 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 16 17:16:43 compute-0 sudo[188243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpjzobmivzliqhexfupddjmfeiqvfzsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262202.856342-196-101560411652201/AnsiballZ_command.py'
Feb 16 17:16:43 compute-0 sudo[188243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:43 compute-0 nova_compute[186176]: 2026-02-16 17:16:43.174 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:16:43 compute-0 nova_compute[186176]: 2026-02-16 17:16:43.250 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:16:43 compute-0 python3.9[188245]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:16:43 compute-0 sudo[188243]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:43 compute-0 sudo[188396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tknaseybwuxxcicnvvumgsizutoemeau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262203.5125017-214-171568950571268/AnsiballZ_file.py'
Feb 16 17:16:43 compute-0 sudo[188396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:43 compute-0 python3.9[188398]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:16:43 compute-0 sudo[188396]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:44 compute-0 python3.9[188548]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:16:45 compute-0 sudo[188700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebjzmwjaxqwptszsznqvikzpqpkglkmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262204.8603337-246-169509373980074/AnsiballZ_group.py'
Feb 16 17:16:45 compute-0 sudo[188700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:45 compute-0 python3.9[188702]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Feb 16 17:16:45 compute-0 sudo[188700]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:46 compute-0 sudo[188852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssrirxuqvveerdpqhlmdwyqpfrjxomyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262205.789744-268-273229343224741/AnsiballZ_getent.py'
Feb 16 17:16:46 compute-0 sudo[188852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:46 compute-0 python3.9[188854]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Feb 16 17:16:46 compute-0 sudo[188852]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:46 compute-0 sudo[189005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtjkkqkqcowpzkridggctfspctvyrwss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262206.6022549-284-94392647306192/AnsiballZ_group.py'
Feb 16 17:16:46 compute-0 sudo[189005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:46 compute-0 python3.9[189007]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 16 17:16:47 compute-0 groupadd[189008]: group added to /etc/group: name=ceilometer, GID=42405
Feb 16 17:16:47 compute-0 groupadd[189008]: group added to /etc/gshadow: name=ceilometer
Feb 16 17:16:47 compute-0 groupadd[189008]: new group: name=ceilometer, GID=42405
Feb 16 17:16:47 compute-0 sudo[189005]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:47 compute-0 sudo[189163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gewopejvzqfcwjrsqokyyylulxumpgvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262207.2004604-300-136536485663349/AnsiballZ_user.py'
Feb 16 17:16:47 compute-0 sudo[189163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:16:47 compute-0 python3.9[189165]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 16 17:16:47 compute-0 useradd[189167]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Feb 16 17:16:47 compute-0 useradd[189167]: add 'ceilometer' to group 'libvirt'
Feb 16 17:16:47 compute-0 useradd[189167]: add 'ceilometer' to shadow group 'libvirt'
Feb 16 17:16:48 compute-0 sudo[189163]: pam_unix(sudo:session): session closed for user root
Feb 16 17:16:49 compute-0 podman[189297]: 2026-02-16 17:16:49.117973801 +0000 UTC m=+0.094334407 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 16 17:16:49 compute-0 python3.9[189336]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:16:49 compute-0 python3.9[189472]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771262208.795701-352-60381331402816/.source.conf _original_basename=ceilometer.conf follow=False checksum=5c6a9288d15d1b05b1484826ce363ad306e9930c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:50 compute-0 python3.9[189622]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:16:50 compute-0 python3.9[189743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771262210.0204637-352-150830248529093/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:51 compute-0 python3.9[189893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:16:52 compute-0 python3.9[190014]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771262211.0914283-352-156236165464711/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:52 compute-0 python3.9[190164]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:16:53 compute-0 python3.9[190316]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:16:53 compute-0 python3.9[190468]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:16:54 compute-0 python3.9[190589]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771262213.419271-470-96338332433498/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:16:55 compute-0 python3.9[190739]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:16:55 compute-0 python3.9[190860]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771262214.6778097-470-209715707768742/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:16:56 compute-0 python3.9[191010]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:16:56 compute-0 python3.9[191131]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771262215.807966-528-212664391949066/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:16:57 compute-0 python3.9[191281]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:16:57 compute-0 python3.9[191402]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262216.9555695-560-118321397169976/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:58 compute-0 python3.9[191552]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:16:59 compute-0 python3.9[191673]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262218.1421294-590-119694416617688/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:16:59 compute-0 python3.9[191823]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:17:00 compute-0 python3.9[191944]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262219.379771-620-153523052475292/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:00 compute-0 sudo[192094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxenmsgrxvyvqeupipioqsawyvltimdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262220.5039396-650-62888638328194/AnsiballZ_file.py'
Feb 16 17:17:00 compute-0 sudo[192094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:00 compute-0 python3.9[192096]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:00 compute-0 sudo[192094]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:01 compute-0 sudo[192246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkvewxxxlgmpklbqsxujsknldhnilvsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262221.070402-666-97123201875046/AnsiballZ_file.py'
Feb 16 17:17:01 compute-0 sudo[192246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:01 compute-0 python3.9[192248]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:01 compute-0 sudo[192246]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:02 compute-0 python3.9[192398]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:17:02 compute-0 python3.9[192550]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:17:03 compute-0 python3.9[192702]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:17:04 compute-0 sudo[192854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iltwwpofkpthgeawvxilrxglasfulyvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262223.8015592-730-107280368893918/AnsiballZ_file.py'
Feb 16 17:17:04 compute-0 sudo[192854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:04 compute-0 python3.9[192856]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:17:04 compute-0 sudo[192854]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:04 compute-0 sudo[193006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tisuhbdgojsqvvbwkulbrnuqgjrbjhwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262224.46946-746-245207183524978/AnsiballZ_systemd_service.py'
Feb 16 17:17:04 compute-0 sudo[193006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:05 compute-0 python3.9[193008]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:17:05 compute-0 systemd[1]: Reloading.
Feb 16 17:17:05 compute-0 systemd-sysv-generator[193042]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:17:05 compute-0 systemd-rc-local-generator[193037]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:17:05 compute-0 systemd[1]: Listening on Podman API Socket.
Feb 16 17:17:05 compute-0 sudo[193006]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:05 compute-0 sudo[193203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxeqgrdjadvksrsrotpcqtklzvxbutdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262225.694403-764-118787862249144/AnsiballZ_stat.py'
Feb 16 17:17:05 compute-0 sudo[193203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:06 compute-0 python3.9[193205]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:17:06 compute-0 sudo[193203]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:06 compute-0 sudo[193326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcafubwvxagylaraffjptaamjqodoztz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262225.694403-764-118787862249144/AnsiballZ_copy.py'
Feb 16 17:17:06 compute-0 sudo[193326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:06 compute-0 python3.9[193328]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771262225.694403-764-118787862249144/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:17:06 compute-0 sudo[193326]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:07 compute-0 sudo[193478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylwtqsltcsdyqoegvkuxooptsysrvkwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262227.3890443-806-83172553570997/AnsiballZ_file.py'
Feb 16 17:17:07 compute-0 sudo[193478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:07 compute-0 python3.9[193480]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:07 compute-0 sudo[193478]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:08 compute-0 sudo[193630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlypmnqfnhrfbdlyrppwyliuyjpkxrmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262228.1008792-822-50955185770311/AnsiballZ_file.py'
Feb 16 17:17:08 compute-0 sudo[193630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:08 compute-0 python3.9[193632]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:17:08 compute-0 sudo[193630]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:09 compute-0 python3.9[193782]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:11 compute-0 sudo[194203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hftxuubltbwsyflsljpclushzwtvqhnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262230.594375-890-217730225000859/AnsiballZ_container_config_data.py'
Feb 16 17:17:11 compute-0 sudo[194203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:11 compute-0 python3.9[194205]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Feb 16 17:17:11 compute-0 sudo[194203]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:12 compute-0 sudo[194355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zubfcuacuafxnclybiczxrrijefvondz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262231.507144-912-37645635343254/AnsiballZ_container_config_hash.py'
Feb 16 17:17:12 compute-0 sudo[194355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:12 compute-0 python3.9[194357]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 17:17:12 compute-0 sudo[194355]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:13 compute-0 podman[194457]: 2026-02-16 17:17:13.113724074 +0000 UTC m=+0.075377727 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 16 17:17:13 compute-0 sudo[194527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkpkuybkcaffmgntyvaezfzgpebqasej ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771262232.6659586-932-98943947834529/AnsiballZ_edpm_container_manage.py'
Feb 16 17:17:13 compute-0 sudo[194527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:13 compute-0 python3[194530]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 17:17:15 compute-0 podman[194544]: 2026-02-16 17:17:15.149345386 +0000 UTC m=+1.639161638 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 16 17:17:15 compute-0 podman[194641]: 2026-02-16 17:17:15.286250395 +0000 UTC m=+0.056002497 container create a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter)
Feb 16 17:17:15 compute-0 podman[194641]: 2026-02-16 17:17:15.254454718 +0000 UTC m=+0.024206870 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 16 17:17:15 compute-0 python3[194530]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Feb 16 17:17:15 compute-0 sudo[194527]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:16 compute-0 sudo[194829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qifgwvzvsvecohldpyvpjmucvjnkhokz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262236.5861237-948-278422518588750/AnsiballZ_stat.py'
Feb 16 17:17:16 compute-0 sudo[194829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:17 compute-0 python3.9[194831]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:17:17 compute-0 sudo[194829]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:17 compute-0 sudo[194983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gflqpetkumyavwdhrraxvzxdqsdpuzrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262237.2652469-966-182888780982841/AnsiballZ_file.py'
Feb 16 17:17:17 compute-0 sudo[194983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:17 compute-0 python3.9[194985]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:17 compute-0 sudo[194983]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:17 compute-0 sudo[195059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjemxghfpmfjnqnwfrsgdlahonglowxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262237.2652469-966-182888780982841/AnsiballZ_stat.py'
Feb 16 17:17:17 compute-0 sudo[195059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:18 compute-0 python3.9[195061]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:17:18 compute-0 sudo[195059]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:18 compute-0 sudo[195210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nowjbqnmnlsjqpwdwqdxlrkoqmuvdysh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262238.1905637-966-156440010320196/AnsiballZ_copy.py'
Feb 16 17:17:18 compute-0 sudo[195210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:18 compute-0 python3.9[195212]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771262238.1905637-966-156440010320196/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:18 compute-0 sudo[195210]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:19 compute-0 sudo[195299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uysmmajhfhelwjpduzmepfaaknbegpot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262238.1905637-966-156440010320196/AnsiballZ_systemd.py'
Feb 16 17:17:19 compute-0 sudo[195299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:19 compute-0 podman[195260]: 2026-02-16 17:17:19.371533119 +0000 UTC m=+0.134418628 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 17:17:19 compute-0 python3.9[195307]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 17:17:19 compute-0 systemd[1]: Reloading.
Feb 16 17:17:19 compute-0 systemd-rc-local-generator[195340]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:17:19 compute-0 systemd-sysv-generator[195345]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:17:19 compute-0 sudo[195299]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:20 compute-0 sudo[195429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blsthofbizmtxbjfzxynezmqwwxggsix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262238.1905637-966-156440010320196/AnsiballZ_systemd.py'
Feb 16 17:17:20 compute-0 sudo[195429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:20 compute-0 python3.9[195431]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:17:20 compute-0 systemd[1]: Reloading.
Feb 16 17:17:20 compute-0 systemd-rc-local-generator[195462]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:17:20 compute-0 systemd-sysv-generator[195465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:17:20 compute-0 systemd[1]: Starting podman_exporter container...
Feb 16 17:17:20 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:17:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/597b0a05d978572d9cf7317af0e04652036967d56fcba053e66e66f6e909a245/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 16 17:17:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/597b0a05d978572d9cf7317af0e04652036967d56fcba053e66e66f6e909a245/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 16 17:17:20 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e.
Feb 16 17:17:20 compute-0 podman[195478]: 2026-02-16 17:17:20.911640766 +0000 UTC m=+0.129660391 container init a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:17:20 compute-0 podman_exporter[195494]: ts=2026-02-16T17:17:20.927Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 16 17:17:20 compute-0 podman_exporter[195494]: ts=2026-02-16T17:17:20.927Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 16 17:17:20 compute-0 podman_exporter[195494]: ts=2026-02-16T17:17:20.927Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 16 17:17:20 compute-0 podman_exporter[195494]: ts=2026-02-16T17:17:20.928Z caller=handler.go:105 level=info collector=container
Feb 16 17:17:20 compute-0 podman[195478]: 2026-02-16 17:17:20.942576011 +0000 UTC m=+0.160595626 container start a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:17:20 compute-0 podman[195478]: podman_exporter
Feb 16 17:17:20 compute-0 systemd[1]: Starting Podman API Service...
Feb 16 17:17:20 compute-0 systemd[1]: Started podman_exporter container.
Feb 16 17:17:20 compute-0 systemd[1]: Started Podman API Service.
Feb 16 17:17:20 compute-0 podman[195505]: time="2026-02-16T17:17:20Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 16 17:17:20 compute-0 podman[195505]: time="2026-02-16T17:17:20Z" level=info msg="Setting parallel job count to 25"
Feb 16 17:17:20 compute-0 podman[195505]: time="2026-02-16T17:17:20Z" level=info msg="Using sqlite as database backend"
Feb 16 17:17:20 compute-0 podman[195505]: time="2026-02-16T17:17:20Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 16 17:17:20 compute-0 podman[195505]: time="2026-02-16T17:17:20Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 16 17:17:20 compute-0 podman[195505]: time="2026-02-16T17:17:20Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Feb 16 17:17:20 compute-0 sudo[195429]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:20 compute-0 podman[195505]: @ - - [16/Feb/2026:17:17:20 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 16 17:17:20 compute-0 podman[195505]: time="2026-02-16T17:17:20Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:17:21 compute-0 podman[195505]: @ - - [16/Feb/2026:17:17:20 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 12585 "" "Go-http-client/1.1"
Feb 16 17:17:21 compute-0 podman_exporter[195494]: ts=2026-02-16T17:17:21.008Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 16 17:17:21 compute-0 podman_exporter[195494]: ts=2026-02-16T17:17:21.009Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 16 17:17:21 compute-0 podman_exporter[195494]: ts=2026-02-16T17:17:21.009Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Feb 16 17:17:21 compute-0 podman[195504]: 2026-02-16 17:17:21.01483001 +0000 UTC m=+0.064022546 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:17:21 compute-0 systemd[1]: a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e-64247ccb60277628.service: Main process exited, code=exited, status=1/FAILURE
Feb 16 17:17:21 compute-0 systemd[1]: a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e-64247ccb60277628.service: Failed with result 'exit-code'.
Feb 16 17:17:21 compute-0 python3.9[195690]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 17:17:22 compute-0 sudo[195840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkqfvgbaggzslrqeyriprcgorodjeblk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262242.4575417-1056-217805435919285/AnsiballZ_stat.py'
Feb 16 17:17:22 compute-0 sudo[195840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:22 compute-0 python3.9[195842]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:17:22 compute-0 sudo[195840]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:23 compute-0 sudo[195965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnubqpkzqowdxjpjvpnvklacnyuxdfhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262242.4575417-1056-217805435919285/AnsiballZ_copy.py'
Feb 16 17:17:23 compute-0 sudo[195965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:23 compute-0 python3.9[195967]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262242.4575417-1056-217805435919285/.source.yaml _original_basename=.q4ayc4_1 follow=False checksum=08edb83c9a3ab263e8eb9d61038618e27d1b94a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:23 compute-0 sudo[195965]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:24 compute-0 sudo[196117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdihlebfwcexkgteclpsjtdqtuvqzojj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262243.6708426-1086-218044923710567/AnsiballZ_stat.py'
Feb 16 17:17:24 compute-0 sudo[196117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:24 compute-0 python3.9[196119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:17:24 compute-0 sudo[196117]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:24 compute-0 sudo[196240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgcpvavteacwedwnvagngmmfmwlvkjzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262243.6708426-1086-218044923710567/AnsiballZ_copy.py'
Feb 16 17:17:24 compute-0 sudo[196240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:24 compute-0 python3.9[196242]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771262243.6708426-1086-218044923710567/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:17:24 compute-0 sudo[196240]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:25 compute-0 rsyslogd[1020]: imjournal: 1940 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 16 17:17:26 compute-0 sudo[196392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmouduaetwwihbdzhurvstswarhzspqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262245.935762-1128-26524679825877/AnsiballZ_file.py'
Feb 16 17:17:26 compute-0 sudo[196392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.321 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.321 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.321 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.322 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:17:26 compute-0 python3.9[196394]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.333 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.333 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.333 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.334 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.334 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.334 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.334 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.335 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.335 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.356 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.357 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.357 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.357 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:17:26 compute-0 sudo[196392]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.485 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.487 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6037MB free_disk=73.39707946777344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.487 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.488 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.552 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.553 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.578 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.590 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.591 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:17:26 compute-0 nova_compute[186176]: 2026-02-16 17:17:26.591 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:17:26 compute-0 sudo[196544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sivsivvjrwochqvukxlqimlryujftyqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262246.5095026-1144-220286374365543/AnsiballZ_file.py'
Feb 16 17:17:26 compute-0 sudo[196544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:26 compute-0 python3.9[196546]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 16 17:17:26 compute-0 sudo[196544]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:27 compute-0 python3.9[196696]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:29 compute-0 sudo[197117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhrqyjncfkgkygajejyphsdywhjinbob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262249.1991668-1212-36889119444709/AnsiballZ_container_config_data.py'
Feb 16 17:17:29 compute-0 sudo[197117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:29 compute-0 python3.9[197119]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Feb 16 17:17:29 compute-0 sudo[197117]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:30 compute-0 sudo[197269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bswkiyervvwwrxrswwdfckmqtnaievvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262250.0121264-1234-166051921762250/AnsiballZ_container_config_hash.py'
Feb 16 17:17:30 compute-0 sudo[197269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:30 compute-0 python3.9[197271]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 16 17:17:30 compute-0 sudo[197269]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:31 compute-0 sudo[197421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfmweqapkramsnldvoueskjlqvueclvm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771262250.8174407-1254-174700914870130/AnsiballZ_edpm_container_manage.py'
Feb 16 17:17:31 compute-0 sudo[197421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:31 compute-0 python3[197423]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 16 17:17:33 compute-0 podman[197437]: 2026-02-16 17:17:33.699007728 +0000 UTC m=+2.292657527 image pull 8da9a5cb84d98cc9d82bfbfe59b1a8f3d35b219d7fadc752f19c50c8fa4c9c58 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 16 17:17:33 compute-0 podman[197533]: 2026-02-16 17:17:33.835024118 +0000 UTC m=+0.054576895 container create 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.7, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-type=git, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, distribution-scope=public, container_name=openstack_network_exporter, release=1770267347)
Feb 16 17:17:33 compute-0 podman[197533]: 2026-02-16 17:17:33.813219009 +0000 UTC m=+0.032771816 image pull 8da9a5cb84d98cc9d82bfbfe59b1a8f3d35b219d7fadc752f19c50c8fa4c9c58 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 16 17:17:33 compute-0 python3[197423]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume 
/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 16 17:17:34 compute-0 sudo[197421]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:34 compute-0 sudo[197721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnpnrdesplwhliswfixzoywjtetbmuqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262254.1683311-1270-280789558495382/AnsiballZ_stat.py'
Feb 16 17:17:34 compute-0 sudo[197721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:34 compute-0 python3.9[197723]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:17:34 compute-0 sudo[197721]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:35 compute-0 sudo[197875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykmppnqdoitgaxvqreoncwabbimvozak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262254.871039-1288-147904141192711/AnsiballZ_file.py'
Feb 16 17:17:35 compute-0 sudo[197875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:35 compute-0 python3.9[197877]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:35 compute-0 sudo[197875]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:35 compute-0 sudo[197951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chkgmvgzndifqxwnzchuludpqtucfiej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262254.871039-1288-147904141192711/AnsiballZ_stat.py'
Feb 16 17:17:35 compute-0 sudo[197951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:35 compute-0 python3.9[197953]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:17:35 compute-0 sudo[197951]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:36 compute-0 sudo[198102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otrchfrszptdnthyrrepzmmwhzbynvpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262255.8387232-1288-72271403614024/AnsiballZ_copy.py'
Feb 16 17:17:36 compute-0 sudo[198102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:36 compute-0 python3.9[198104]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771262255.8387232-1288-72271403614024/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:36 compute-0 sudo[198102]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:36 compute-0 sudo[198178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdsrxnfezzjqfesmjvaupsiyjsjvaejc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262255.8387232-1288-72271403614024/AnsiballZ_systemd.py'
Feb 16 17:17:36 compute-0 sudo[198178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:36 compute-0 python3.9[198180]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 16 17:17:36 compute-0 systemd[1]: Reloading.
Feb 16 17:17:37 compute-0 systemd-sysv-generator[198210]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:17:37 compute-0 systemd-rc-local-generator[198203]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:17:37 compute-0 sudo[198178]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:37 compute-0 sudo[198295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxufieldumwzxnozuafjkkshrtseazid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262255.8387232-1288-72271403614024/AnsiballZ_systemd.py'
Feb 16 17:17:37 compute-0 sudo[198295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:37 compute-0 python3.9[198297]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 16 17:17:37 compute-0 systemd[1]: Reloading.
Feb 16 17:17:37 compute-0 systemd-rc-local-generator[198325]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 16 17:17:37 compute-0 systemd-sysv-generator[198331]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 16 17:17:38 compute-0 systemd[1]: Starting openstack_network_exporter container...
Feb 16 17:17:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:17:38.142 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:17:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:17:38.143 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:17:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:17:38.144 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:17:38 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:17:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bae323a3d8cf0c287f786d3865445b5fb66388e8c38b38b07389b492356377fe/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 16 17:17:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bae323a3d8cf0c287f786d3865445b5fb66388e8c38b38b07389b492356377fe/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 16 17:17:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bae323a3d8cf0c287f786d3865445b5fb66388e8c38b38b07389b492356377fe/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 16 17:17:38 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4.
Feb 16 17:17:38 compute-0 podman[198344]: 2026-02-16 17:17:38.271643423 +0000 UTC m=+0.134965455 container init 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., version=9.7)
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: INFO    17:17:38 main.go:48: registering *bridge.Collector
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: INFO    17:17:38 main.go:48: registering *coverage.Collector
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: INFO    17:17:38 main.go:48: registering *datapath.Collector
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: INFO    17:17:38 main.go:48: registering *iface.Collector
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: INFO    17:17:38 main.go:48: registering *memory.Collector
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: INFO    17:17:38 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: INFO    17:17:38 main.go:48: registering *ovn.Collector
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: INFO    17:17:38 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: INFO    17:17:38 main.go:48: registering *pmd_perf.Collector
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: INFO    17:17:38 main.go:48: registering *pmd_rxq.Collector
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: INFO    17:17:38 main.go:48: registering *vswitch.Collector
Feb 16 17:17:38 compute-0 openstack_network_exporter[198360]: NOTICE  17:17:38 main.go:76: listening on https://:9105/metrics
Feb 16 17:17:38 compute-0 podman[198344]: 2026-02-16 17:17:38.305966396 +0000 UTC m=+0.169288388 container start 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, version=9.7, maintainer=Red Hat, Inc., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Feb 16 17:17:38 compute-0 podman[198344]: openstack_network_exporter
Feb 16 17:17:38 compute-0 systemd[1]: Started openstack_network_exporter container.
Feb 16 17:17:38 compute-0 sudo[198295]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:38 compute-0 podman[198370]: 2026-02-16 17:17:38.378638579 +0000 UTC m=+0.064469345 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.openshift.expose-services=, architecture=x86_64, config_id=openstack_network_exporter, vcs-type=git, version=9.7)
Feb 16 17:17:38 compute-0 python3.9[198542]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 16 17:17:40 compute-0 sudo[198692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwededgulzwnvughvhouriuascsqhcin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262259.6180098-1378-98064489160017/AnsiballZ_stat.py'
Feb 16 17:17:40 compute-0 sudo[198692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:40 compute-0 python3.9[198694]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:17:40 compute-0 sudo[198692]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:40 compute-0 sudo[198817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qreezbfbbzrnytwfxomzgxwlwkghbmyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262259.6180098-1378-98064489160017/AnsiballZ_copy.py'
Feb 16 17:17:40 compute-0 sudo[198817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:40 compute-0 python3.9[198819]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262259.6180098-1378-98064489160017/.source.yaml _original_basename=.ine1i62b follow=False checksum=8f7fa9db9275833528826dcade5daef27e4a241e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:40 compute-0 sudo[198817]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:41 compute-0 sudo[198969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zotxwbdgwbgbclwirkisxcuqmwaeboxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262260.990393-1408-84862400658851/AnsiballZ_find.py'
Feb 16 17:17:41 compute-0 sudo[198969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:41 compute-0 python3.9[198971]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 16 17:17:41 compute-0 sudo[198969]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:42 compute-0 sudo[199121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bokvoaigzcdchvkeelrrzajawzburpzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262261.7832897-1427-120007669774643/AnsiballZ_podman_container_info.py'
Feb 16 17:17:42 compute-0 sudo[199121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:42 compute-0 python3.9[199123]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Feb 16 17:17:42 compute-0 sudo[199121]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:43 compute-0 sudo[199294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iznnmpwovtlxqaxnbrqubekfwxpxsugx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262262.730663-1435-256582837801717/AnsiballZ_podman_container_exec.py'
Feb 16 17:17:43 compute-0 sudo[199294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:43 compute-0 podman[199260]: 2026-02-16 17:17:43.29080917 +0000 UTC m=+0.076181259 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 16 17:17:43 compute-0 python3.9[199305]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 17:17:43 compute-0 systemd[1]: Started libpod-conmon-6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73.scope.
Feb 16 17:17:43 compute-0 podman[199308]: 2026-02-16 17:17:43.534276486 +0000 UTC m=+0.082436261 container exec 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:17:43 compute-0 podman[199308]: 2026-02-16 17:17:43.568566018 +0000 UTC m=+0.116725823 container exec_died 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 17:17:43 compute-0 systemd[1]: libpod-conmon-6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73.scope: Deactivated successfully.
Feb 16 17:17:43 compute-0 sudo[199294]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:43 compute-0 sudo[199486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwslfrdrgqsstnezfkaoozwvxhgdynyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262263.7773104-1443-39999308798782/AnsiballZ_podman_container_exec.py'
Feb 16 17:17:44 compute-0 sudo[199486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:44 compute-0 python3.9[199488]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 17:17:44 compute-0 systemd[1]: Started libpod-conmon-6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73.scope.
Feb 16 17:17:44 compute-0 podman[199489]: 2026-02-16 17:17:44.281020481 +0000 UTC m=+0.076479446 container exec 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 16 17:17:44 compute-0 podman[199489]: 2026-02-16 17:17:44.310630219 +0000 UTC m=+0.106089184 container exec_died 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:17:44 compute-0 systemd[1]: libpod-conmon-6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73.scope: Deactivated successfully.
Feb 16 17:17:44 compute-0 sudo[199486]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:44 compute-0 sudo[199669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxnuysknujkzinlthvurnhfmvcorkwwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262264.5062277-1451-202722665009447/AnsiballZ_file.py'
Feb 16 17:17:44 compute-0 sudo[199669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:44 compute-0 python3.9[199671]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:44 compute-0 sudo[199669]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:45 compute-0 sudo[199821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztqqxothvuzzcsorlqpzekqqfvoumtkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262265.1990733-1460-250012081715547/AnsiballZ_podman_container_info.py'
Feb 16 17:17:45 compute-0 sudo[199821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:45 compute-0 python3.9[199823]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Feb 16 17:17:45 compute-0 sudo[199821]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:46 compute-0 sudo[199986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phkbaqlezaihcldgnivfdirypgpxxxqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262265.8394403-1468-171981260041351/AnsiballZ_podman_container_exec.py'
Feb 16 17:17:46 compute-0 sudo[199986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:46 compute-0 python3.9[199988]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 17:17:46 compute-0 systemd[1]: Started libpod-conmon-216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907.scope.
Feb 16 17:17:46 compute-0 podman[199989]: 2026-02-16 17:17:46.424135039 +0000 UTC m=+0.079370216 container exec 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 16 17:17:46 compute-0 podman[200009]: 2026-02-16 17:17:46.488273525 +0000 UTC m=+0.052940565 container exec_died 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 17:17:46 compute-0 podman[199989]: 2026-02-16 17:17:46.493864271 +0000 UTC m=+0.149099448 container exec_died 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Feb 16 17:17:46 compute-0 systemd[1]: libpod-conmon-216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907.scope: Deactivated successfully.
Feb 16 17:17:46 compute-0 sudo[199986]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:46 compute-0 sudo[200171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfgqgdsmevwnehuxqwgchtfhoynljrss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262266.6615238-1476-254550844176796/AnsiballZ_podman_container_exec.py'
Feb 16 17:17:46 compute-0 sudo[200171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:47 compute-0 python3.9[200173]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 17:17:47 compute-0 systemd[1]: Started libpod-conmon-216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907.scope.
Feb 16 17:17:47 compute-0 podman[200174]: 2026-02-16 17:17:47.212691609 +0000 UTC m=+0.105654024 container exec 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 16 17:17:47 compute-0 podman[200174]: 2026-02-16 17:17:47.24285344 +0000 UTC m=+0.135815855 container exec_died 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:17:47 compute-0 systemd[1]: libpod-conmon-216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907.scope: Deactivated successfully.
Feb 16 17:17:47 compute-0 sudo[200171]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:47 compute-0 sudo[200355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fubkdqwewmpqezprlunlqdrehvpuxlaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262267.4321914-1484-25040006584396/AnsiballZ_file.py'
Feb 16 17:17:47 compute-0 sudo[200355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:47 compute-0 python3.9[200357]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:47 compute-0 sudo[200355]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:48 compute-0 sudo[200507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riuzjhhxathvinoecceltmldolestnym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262268.076131-1493-40913672718763/AnsiballZ_podman_container_info.py'
Feb 16 17:17:48 compute-0 sudo[200507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:48 compute-0 python3.9[200509]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 16 17:17:48 compute-0 sudo[200507]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:48 compute-0 sudo[200671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwzbqzmzvdykificxrouicfsifnwwszt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262268.713423-1501-17030221963776/AnsiballZ_podman_container_exec.py'
Feb 16 17:17:48 compute-0 sudo[200671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:49 compute-0 python3.9[200673]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 17:17:49 compute-0 systemd[1]: Started libpod-conmon-a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e.scope.
Feb 16 17:17:49 compute-0 podman[200674]: 2026-02-16 17:17:49.225936537 +0000 UTC m=+0.060100879 container exec a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:17:49 compute-0 podman[200694]: 2026-02-16 17:17:49.285213345 +0000 UTC m=+0.049830240 container exec_died a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 17:17:49 compute-0 podman[200674]: 2026-02-16 17:17:49.29776748 +0000 UTC m=+0.131931832 container exec_died a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 17:17:49 compute-0 systemd[1]: libpod-conmon-a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e.scope: Deactivated successfully.
Feb 16 17:17:49 compute-0 sudo[200671]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:49 compute-0 sudo[200868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blctjcfnbzyknastvensvwtkxprutdew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262269.4655292-1509-19462122248238/AnsiballZ_podman_container_exec.py'
Feb 16 17:17:49 compute-0 sudo[200868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:49 compute-0 podman[200831]: 2026-02-16 17:17:49.819957336 +0000 UTC m=+0.134054352 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:17:49 compute-0 python3.9[200875]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 17:17:50 compute-0 systemd[1]: Started libpod-conmon-a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e.scope.
Feb 16 17:17:50 compute-0 podman[200885]: 2026-02-16 17:17:50.029430218 +0000 UTC m=+0.096182324 container exec a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:17:50 compute-0 podman[200885]: 2026-02-16 17:17:50.060529322 +0000 UTC m=+0.127281328 container exec_died a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:17:50 compute-0 systemd[1]: libpod-conmon-a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e.scope: Deactivated successfully.
Feb 16 17:17:50 compute-0 sudo[200868]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:50 compute-0 sudo[201067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jslsqzwjrgyimxgxclextqkxgpmrmkcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262270.2564816-1517-82729015569632/AnsiballZ_file.py'
Feb 16 17:17:50 compute-0 sudo[201067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:50 compute-0 python3.9[201069]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:50 compute-0 sudo[201067]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:51 compute-0 sudo[201234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fukaakjtprrwhlyfhuhgoshvwuatqtyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262270.915332-1526-62765123765513/AnsiballZ_podman_container_info.py'
Feb 16 17:17:51 compute-0 sudo[201234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:51 compute-0 podman[201193]: 2026-02-16 17:17:51.233342163 +0000 UTC m=+0.061964814 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:17:51 compute-0 python3.9[201245]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 16 17:17:51 compute-0 sudo[201234]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:51 compute-0 sudo[201408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvtvondjynjcpwrbmntyqfrpqqprecwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262271.6705647-1534-256735833070214/AnsiballZ_podman_container_exec.py'
Feb 16 17:17:51 compute-0 sudo[201408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:52 compute-0 python3.9[201410]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 17:17:52 compute-0 systemd[1]: Started libpod-conmon-9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4.scope.
Feb 16 17:17:52 compute-0 podman[201411]: 2026-02-16 17:17:52.21554665 +0000 UTC m=+0.090884346 container exec 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1770267347, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 17:17:52 compute-0 podman[201411]: 2026-02-16 17:17:52.249580536 +0000 UTC m=+0.124918222 container exec_died 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, config_id=openstack_network_exporter, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Feb 16 17:17:52 compute-0 systemd[1]: libpod-conmon-9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4.scope: Deactivated successfully.
Feb 16 17:17:52 compute-0 sudo[201408]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:52 compute-0 sudo[201592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luoseluqxokwfruikiziqlbpfqraxzyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262272.428532-1542-214274513316083/AnsiballZ_podman_container_exec.py'
Feb 16 17:17:52 compute-0 sudo[201592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:52 compute-0 python3.9[201594]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 16 17:17:53 compute-0 systemd[1]: Started libpod-conmon-9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4.scope.
Feb 16 17:17:53 compute-0 podman[201595]: 2026-02-16 17:17:53.030072378 +0000 UTC m=+0.081344514 container exec 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, release=1770267347, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 17:17:53 compute-0 podman[201595]: 2026-02-16 17:17:53.060606968 +0000 UTC m=+0.111879014 container exec_died 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1770267347, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2026-02-05T04:57:10Z, distribution-scope=public)
Feb 16 17:17:53 compute-0 systemd[1]: libpod-conmon-9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4.scope: Deactivated successfully.
Feb 16 17:17:53 compute-0 sudo[201592]: pam_unix(sudo:session): session closed for user root
Feb 16 17:17:53 compute-0 sudo[201777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbxhzafjkqvvclakxowgzyksccszawtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262273.2556193-1550-5199389347001/AnsiballZ_file.py'
Feb 16 17:17:53 compute-0 sudo[201777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:17:53 compute-0 python3.9[201779]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:17:53 compute-0 sudo[201777]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:04 compute-0 sudo[201929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgvqtlxwbgelfuocgfalkvljvepihfad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262284.7011354-1692-187339701837382/AnsiballZ_file.py'
Feb 16 17:18:04 compute-0 sudo[201929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:05 compute-0 python3.9[201931]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:05 compute-0 sudo[201929]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:05 compute-0 sudo[202081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnpajguoijcmqlrkhejltkcziwuavpsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262285.2916517-1708-171239292937924/AnsiballZ_stat.py'
Feb 16 17:18:05 compute-0 sudo[202081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:05 compute-0 python3.9[202083]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:18:05 compute-0 sudo[202081]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:06 compute-0 sudo[202204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uevzoulczxouzsuvebaixwbslpyesapz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262285.2916517-1708-171239292937924/AnsiballZ_copy.py'
Feb 16 17:18:06 compute-0 sudo[202204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:06 compute-0 python3.9[202206]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771262285.2916517-1708-171239292937924/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:06 compute-0 sudo[202204]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:06 compute-0 sudo[202356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyfjuvooydidtpbaftzuoqnvtlxbldxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262286.4725256-1740-199869855311107/AnsiballZ_file.py'
Feb 16 17:18:06 compute-0 sudo[202356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:07 compute-0 python3.9[202358]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:07 compute-0 sudo[202356]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:07 compute-0 sudo[202508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhhlnxzaffkwobotuxxtyzywovtpzlfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262287.3850102-1756-170303857683247/AnsiballZ_stat.py'
Feb 16 17:18:07 compute-0 sudo[202508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:07 compute-0 python3.9[202510]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:18:07 compute-0 sudo[202508]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:08 compute-0 sudo[202586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxbddemmcvgefwhmfchrbdisbzwewwok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262287.3850102-1756-170303857683247/AnsiballZ_file.py'
Feb 16 17:18:08 compute-0 sudo[202586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:08 compute-0 python3.9[202588]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:08 compute-0 sudo[202586]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:09 compute-0 podman[202613]: 2026-02-16 17:18:09.097652273 +0000 UTC m=+0.066758311 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64)
Feb 16 17:18:10 compute-0 sudo[202761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goydxgdpmgksgxmuydgggfoykcrjpzuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262290.0503824-1780-148495887204422/AnsiballZ_stat.py'
Feb 16 17:18:10 compute-0 sudo[202761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:10 compute-0 python3.9[202763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:18:10 compute-0 sudo[202761]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:10 compute-0 sudo[202839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-antmhgyntrtqxexszgfvjfklvzuzudrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262290.0503824-1780-148495887204422/AnsiballZ_file.py'
Feb 16 17:18:10 compute-0 sudo[202839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:11 compute-0 python3.9[202841]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.r3hjedv2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:11 compute-0 sudo[202839]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:11 compute-0 sudo[202991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzztswqtztpikstrjrdxoesedfrstqbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262291.275618-1804-179623735850582/AnsiballZ_stat.py'
Feb 16 17:18:11 compute-0 sudo[202991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:11 compute-0 python3.9[202993]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:18:11 compute-0 sudo[202991]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:12 compute-0 sudo[203069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-busxzeuvrslozzpygamuczlpiiksugtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262291.275618-1804-179623735850582/AnsiballZ_file.py'
Feb 16 17:18:12 compute-0 sudo[203069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:12 compute-0 python3.9[203071]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:12 compute-0 sudo[203069]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:12 compute-0 sudo[203221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrqaqzrydlcllkeffgxugogzsatxtshx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262292.4422889-1830-159588096509754/AnsiballZ_command.py'
Feb 16 17:18:12 compute-0 sudo[203221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:12 compute-0 python3.9[203223]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:18:12 compute-0 sudo[203221]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:13 compute-0 sudo[203387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djrzsgiejxfpjyvyuplbstnazywmhuvr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771262293.1101255-1846-222791310183847/AnsiballZ_edpm_nftables_from_files.py'
Feb 16 17:18:13 compute-0 sudo[203387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:13 compute-0 podman[203348]: 2026-02-16 17:18:13.539631278 +0000 UTC m=+0.053695743 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:18:13 compute-0 python3[203395]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 16 17:18:13 compute-0 sudo[203387]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:14 compute-0 sudo[203545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cetdxdywsnyvchvysutiqttdyujscuvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262293.941421-1862-26661575814481/AnsiballZ_stat.py'
Feb 16 17:18:14 compute-0 sudo[203545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:14 compute-0 python3.9[203547]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:18:14 compute-0 sudo[203545]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:14 compute-0 sudo[203623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyexzgbkxezcszqnfflkkvbfsprwztej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262293.941421-1862-26661575814481/AnsiballZ_file.py'
Feb 16 17:18:14 compute-0 sudo[203623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:15 compute-0 python3.9[203625]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:15 compute-0 sudo[203623]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:15 compute-0 sudo[203775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypyyiizrgurdwjgrhhkcbgrzrxcxpatg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262295.212616-1886-134872680181187/AnsiballZ_stat.py'
Feb 16 17:18:15 compute-0 sudo[203775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:15 compute-0 python3.9[203777]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:18:15 compute-0 sudo[203775]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:15 compute-0 sudo[203853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewedicbgceqgryulljmhaembtjcfuqnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262295.212616-1886-134872680181187/AnsiballZ_file.py'
Feb 16 17:18:15 compute-0 sudo[203853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:16 compute-0 python3.9[203855]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:16 compute-0 sudo[203853]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:16 compute-0 auditd[723]: Audit daemon rotating log files
Feb 16 17:18:16 compute-0 sudo[204005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkszaspqhpxelgqfnadqhvfajvrsfras ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262296.3507533-1910-125457674437432/AnsiballZ_stat.py'
Feb 16 17:18:16 compute-0 sudo[204005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:16 compute-0 python3.9[204007]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:18:16 compute-0 sudo[204005]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:17 compute-0 sudo[204083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umyosbhvmesgqpskcfzuoslbkkofpqvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262296.3507533-1910-125457674437432/AnsiballZ_file.py'
Feb 16 17:18:17 compute-0 sudo[204083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:17 compute-0 python3.9[204085]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:17 compute-0 sudo[204083]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:17 compute-0 sudo[204235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znkncfryqemuvbxflhprilettmmovxda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262297.4361432-1934-274312808183003/AnsiballZ_stat.py'
Feb 16 17:18:17 compute-0 sudo[204235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:17 compute-0 python3.9[204237]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:18:18 compute-0 sudo[204235]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:18 compute-0 sudo[204313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biwpqloehvwoihybyzvcnckjuoukcxua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262297.4361432-1934-274312808183003/AnsiballZ_file.py'
Feb 16 17:18:18 compute-0 sudo[204313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:18 compute-0 python3.9[204315]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:18 compute-0 sudo[204313]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:19 compute-0 sudo[204465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpbtoitttwsbxsdvlrgdimogzugollor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262298.8252292-1958-151186084926426/AnsiballZ_stat.py'
Feb 16 17:18:19 compute-0 sudo[204465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:19 compute-0 python3.9[204467]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 16 17:18:19 compute-0 sudo[204465]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:19 compute-0 sudo[204590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjphlspbwudhmbhodqooiocmkvcnghgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262298.8252292-1958-151186084926426/AnsiballZ_copy.py'
Feb 16 17:18:19 compute-0 sudo[204590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:19 compute-0 python3.9[204592]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771262298.8252292-1958-151186084926426/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:20 compute-0 sudo[204590]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:20 compute-0 podman[204593]: 2026-02-16 17:18:20.128852562 +0000 UTC m=+0.098172822 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 16 17:18:20 compute-0 sudo[204769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyobgkitatbxavppthwqdhonwgxycbvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262300.1828647-1988-79452079647217/AnsiballZ_file.py'
Feb 16 17:18:20 compute-0 sudo[204769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:20 compute-0 python3.9[204771]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:20 compute-0 sudo[204769]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:21 compute-0 sudo[204921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etgrfvccfkgcpxzpgwurqloxxkhwdwvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262300.8136747-2004-233800271585849/AnsiballZ_command.py'
Feb 16 17:18:21 compute-0 sudo[204921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:21 compute-0 python3.9[204923]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:18:21 compute-0 sudo[204921]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:21 compute-0 sudo[205090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnrjdezwpojimyxbefkdwfkmmhemtias ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262301.4391441-2020-170486067394453/AnsiballZ_blockinfile.py'
Feb 16 17:18:21 compute-0 sudo[205090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:21 compute-0 podman[205050]: 2026-02-16 17:18:21.875116703 +0000 UTC m=+0.083303462 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:18:22 compute-0 python3.9[205098]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:22 compute-0 sudo[205090]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:22 compute-0 sudo[205253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viouemoawxageuyaoybqdztalqjbekvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262302.2770061-2038-33305844046672/AnsiballZ_command.py'
Feb 16 17:18:22 compute-0 sudo[205253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:22 compute-0 python3.9[205255]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:18:22 compute-0 sudo[205253]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:23 compute-0 sudo[205406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykigoqbybfvrctuczioqcnmprhyknaqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262303.0942452-2054-98423973092704/AnsiballZ_stat.py'
Feb 16 17:18:23 compute-0 sudo[205406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:23 compute-0 python3.9[205408]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 16 17:18:23 compute-0 sudo[205406]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:24 compute-0 sudo[205560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zsdpympaluqrkfdfdedkniolanizanub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262303.7486603-2070-59707856838667/AnsiballZ_command.py'
Feb 16 17:18:24 compute-0 sudo[205560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:24 compute-0 python3.9[205562]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 16 17:18:24 compute-0 sudo[205560]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:24 compute-0 sudo[205715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dupwkuetkdwqqmkoqfgvsemglcrcyddd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771262304.4284694-2086-235777278238194/AnsiballZ_file.py'
Feb 16 17:18:24 compute-0 sudo[205715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:18:24 compute-0 python3.9[205717]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 16 17:18:24 compute-0 sudo[205715]: pam_unix(sudo:session): session closed for user root
Feb 16 17:18:25 compute-0 sshd-session[186480]: Connection closed by 192.168.122.30 port 54026
Feb 16 17:18:25 compute-0 sshd-session[186477]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:18:25 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Feb 16 17:18:25 compute-0 systemd[1]: session-26.scope: Consumed 1min 13.435s CPU time.
Feb 16 17:18:25 compute-0 systemd-logind[821]: Session 26 logged out. Waiting for processes to exit.
Feb 16 17:18:25 compute-0 systemd-logind[821]: Removed session 26.
Feb 16 17:18:26 compute-0 nova_compute[186176]: 2026-02-16 17:18:26.581 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:18:26 compute-0 nova_compute[186176]: 2026-02-16 17:18:26.612 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:18:26 compute-0 nova_compute[186176]: 2026-02-16 17:18:26.613 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:18:26 compute-0 nova_compute[186176]: 2026-02-16 17:18:26.613 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:18:26 compute-0 nova_compute[186176]: 2026-02-16 17:18:26.662 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:18:26 compute-0 nova_compute[186176]: 2026-02-16 17:18:26.662 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:18:26 compute-0 nova_compute[186176]: 2026-02-16 17:18:26.663 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:18:26 compute-0 nova_compute[186176]: 2026-02-16 17:18:26.663 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.348 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.349 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.563 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.565 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5993MB free_disk=73.26214218139648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.565 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.566 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.668 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.668 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.699 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.715 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.718 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:18:27 compute-0 nova_compute[186176]: 2026-02-16 17:18:27.719 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:18:28 compute-0 nova_compute[186176]: 2026-02-16 17:18:28.714 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:18:28 compute-0 nova_compute[186176]: 2026-02-16 17:18:28.715 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:18:28 compute-0 nova_compute[186176]: 2026-02-16 17:18:28.715 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:18:28 compute-0 nova_compute[186176]: 2026-02-16 17:18:28.716 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:18:29 compute-0 podman[195505]: time="2026-02-16T17:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:18:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:18:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2133 "" "Go-http-client/1.1"
Feb 16 17:18:31 compute-0 openstack_network_exporter[198360]: ERROR   17:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:18:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:18:31 compute-0 openstack_network_exporter[198360]: ERROR   17:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:18:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:18:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:18:38.143 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:18:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:18:38.144 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:18:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:18:38.145 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:18:40 compute-0 podman[205750]: 2026-02-16 17:18:40.090376648 +0000 UTC m=+0.058002800 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1770267347, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, vcs-type=git, version=9.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 16 17:18:44 compute-0 podman[205772]: 2026-02-16 17:18:44.094795419 +0000 UTC m=+0.065169804 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 16 17:18:51 compute-0 podman[205792]: 2026-02-16 17:18:51.158631192 +0000 UTC m=+0.123792138 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, 
org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 17:18:52 compute-0 podman[205818]: 2026-02-16 17:18:52.091121668 +0000 UTC m=+0.054331150 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:18:59 compute-0 podman[195505]: time="2026-02-16T17:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:18:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:18:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2140 "" "Go-http-client/1.1"
Feb 16 17:19:01 compute-0 openstack_network_exporter[198360]: ERROR   17:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:19:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:19:01 compute-0 openstack_network_exporter[198360]: ERROR   17:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:19:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:19:11 compute-0 podman[205846]: 2026-02-16 17:19:11.090419983 +0000 UTC m=+0.059405203 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, 
config_id=openstack_network_exporter, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 17:19:15 compute-0 podman[205868]: 2026-02-16 17:19:15.11315154 +0000 UTC m=+0.070151845 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 16 17:19:22 compute-0 podman[205887]: 2026-02-16 17:19:22.099893609 +0000 UTC m=+0.076570051 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 16 17:19:22 compute-0 podman[205913]: 2026-02-16 17:19:22.207527773 +0000 UTC m=+0.075085424 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:19:26 compute-0 nova_compute[186176]: 2026-02-16 17:19:26.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:19:26 compute-0 nova_compute[186176]: 2026-02-16 17:19:26.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:19:26 compute-0 nova_compute[186176]: 2026-02-16 17:19:26.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:19:26 compute-0 nova_compute[186176]: 2026-02-16 17:19:26.426 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:19:26 compute-0 nova_compute[186176]: 2026-02-16 17:19:26.427 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:19:26 compute-0 nova_compute[186176]: 2026-02-16 17:19:26.427 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.353 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.354 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.354 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.355 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.536 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.537 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6139MB free_disk=73.26283264160156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.537 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.537 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.618 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.618 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.645 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.662 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.664 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:19:27 compute-0 nova_compute[186176]: 2026-02-16 17:19:27.664 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:19:28 compute-0 nova_compute[186176]: 2026-02-16 17:19:28.660 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:19:28 compute-0 nova_compute[186176]: 2026-02-16 17:19:28.661 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:19:28 compute-0 nova_compute[186176]: 2026-02-16 17:19:28.661 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:19:28 compute-0 nova_compute[186176]: 2026-02-16 17:19:28.661 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:19:29 compute-0 nova_compute[186176]: 2026-02-16 17:19:29.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:19:30 compute-0 nova_compute[186176]: 2026-02-16 17:19:30.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:19:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:19:38.143 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:19:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:19:38.144 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:19:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:19:38.144 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:19:42 compute-0 podman[205937]: 2026-02-16 17:19:42.083984295 +0000 UTC m=+0.057627956 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, release=1770267347, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 16 17:19:46 compute-0 podman[205959]: 2026-02-16 17:19:46.123446882 +0000 UTC m=+0.081870917 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:19:50 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:19:50.338 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:19:50 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:19:50.340 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:19:50 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:19:50.342 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:19:53 compute-0 podman[205979]: 2026-02-16 17:19:53.097600035 +0000 UTC m=+0.062752705 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:19:53 compute-0 podman[205978]: 2026-02-16 17:19:53.115924527 +0000 UTC m=+0.084552785 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, container_name=ovn_controller)
Feb 16 17:19:59 compute-0 podman[195505]: time="2026-02-16T17:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:19:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:19:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2148 "" "Go-http-client/1.1"
Feb 16 17:20:01 compute-0 openstack_network_exporter[198360]: ERROR   17:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:20:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:20:01 compute-0 openstack_network_exporter[198360]: ERROR   17:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:20:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:20:13 compute-0 podman[206026]: 2026-02-16 17:20:13.099690535 +0000 UTC m=+0.070237413 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1770267347, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Feb 16 17:20:17 compute-0 podman[206047]: 2026-02-16 17:20:17.099267756 +0000 UTC m=+0.066325865 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 16 17:20:24 compute-0 podman[206068]: 2026-02-16 17:20:24.148819169 +0000 UTC m=+0.110414686 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 17:20:24 compute-0 podman[206067]: 2026-02-16 17:20:24.16389421 +0000 UTC m=+0.131378945 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
config_id=ovn_controller, container_name=ovn_controller)
Feb 16 17:20:27 compute-0 nova_compute[186176]: 2026-02-16 17:20:27.313 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:20:27 compute-0 nova_compute[186176]: 2026-02-16 17:20:27.889 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:20:27 compute-0 nova_compute[186176]: 2026-02-16 17:20:27.890 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:20:27 compute-0 nova_compute[186176]: 2026-02-16 17:20:27.890 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.006 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.510 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.510 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.511 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.511 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.688 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.689 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6180MB free_disk=73.26281356811523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.690 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.690 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.783 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.784 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.850 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.877 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.879 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:20:28 compute-0 nova_compute[186176]: 2026-02-16 17:20:28.879 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:20:29 compute-0 podman[195505]: time="2026-02-16T17:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:20:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:20:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2146 "" "Go-http-client/1.1"
Feb 16 17:20:29 compute-0 nova_compute[186176]: 2026-02-16 17:20:29.879 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:20:30 compute-0 nova_compute[186176]: 2026-02-16 17:20:30.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:20:30 compute-0 nova_compute[186176]: 2026-02-16 17:20:30.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:20:31 compute-0 nova_compute[186176]: 2026-02-16 17:20:31.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:20:31 compute-0 openstack_network_exporter[198360]: ERROR   17:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:20:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:20:31 compute-0 openstack_network_exporter[198360]: ERROR   17:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:20:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:20:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:20:38.145 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:20:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:20:38.146 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:20:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:20:38.146 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:20:44 compute-0 podman[206118]: 2026-02-16 17:20:44.116618339 +0000 UTC m=+0.090136933 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, release=1770267347, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., 
build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7)
Feb 16 17:20:48 compute-0 podman[206141]: 2026-02-16 17:20:48.101478302 +0000 UTC m=+0.070020075 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 16 17:20:55 compute-0 podman[206163]: 2026-02-16 17:20:55.118274375 +0000 UTC m=+0.077599649 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:20:55 compute-0 podman[206162]: 2026-02-16 17:20:55.135441187 +0000 UTC m=+0.101816633 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, 
config_id=ovn_controller, container_name=ovn_controller)
Feb 16 17:20:59 compute-0 podman[195505]: time="2026-02-16T17:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:20:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:20:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2148 "" "Go-http-client/1.1"
Feb 16 17:21:01 compute-0 openstack_network_exporter[198360]: ERROR   17:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:21:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:21:01 compute-0 openstack_network_exporter[198360]: ERROR   17:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:21:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:21:15 compute-0 podman[206211]: 2026-02-16 17:21:15.129507676 +0000 UTC m=+0.100947050 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 
'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vcs-type=git, container_name=openstack_network_exporter)
Feb 16 17:21:19 compute-0 podman[206234]: 2026-02-16 17:21:19.085153828 +0000 UTC m=+0.058561009 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Feb 16 17:21:26 compute-0 podman[206255]: 2026-02-16 17:21:26.105970762 +0000 UTC m=+0.071620055 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:21:26 compute-0 podman[206254]: 2026-02-16 17:21:26.113553628 +0000 UTC m=+0.087240948 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:21:26 compute-0 nova_compute[186176]: 2026-02-16 17:21:26.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:26 compute-0 nova_compute[186176]: 2026-02-16 17:21:26.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 17:21:26 compute-0 nova_compute[186176]: 2026-02-16 17:21:26.336 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 17:21:26 compute-0 nova_compute[186176]: 2026-02-16 17:21:26.337 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:26 compute-0 nova_compute[186176]: 2026-02-16 17:21:26.337 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 17:21:26 compute-0 nova_compute[186176]: 2026-02-16 17:21:26.351 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:28 compute-0 nova_compute[186176]: 2026-02-16 17:21:28.366 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:28 compute-0 nova_compute[186176]: 2026-02-16 17:21:28.367 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:21:28 compute-0 nova_compute[186176]: 2026-02-16 17:21:28.367 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:21:28 compute-0 nova_compute[186176]: 2026-02-16 17:21:28.420 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:21:28 compute-0 nova_compute[186176]: 2026-02-16 17:21:28.421 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:28 compute-0 nova_compute[186176]: 2026-02-16 17:21:28.422 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:21:29 compute-0 nova_compute[186176]: 2026-02-16 17:21:29.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:29 compute-0 podman[195505]: time="2026-02-16T17:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:21:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:21:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2149 "" "Go-http-client/1.1"
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.343 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.344 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.344 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.345 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.539 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.540 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6185MB free_disk=73.26285552978516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.540 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.541 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.691 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.692 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.741 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing inventories for resource provider bb904aac-529f-46ef-9861-9c655a4b383c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.788 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating ProviderTree inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.788 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.802 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing aggregate associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.824 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing trait associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.848 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.861 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.864 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:21:30 compute-0 nova_compute[186176]: 2026-02-16 17:21:30.865 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:21:31 compute-0 openstack_network_exporter[198360]: ERROR   17:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:21:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:21:31 compute-0 openstack_network_exporter[198360]: ERROR   17:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:21:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:21:31 compute-0 nova_compute[186176]: 2026-02-16 17:21:31.866 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:32 compute-0 nova_compute[186176]: 2026-02-16 17:21:32.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:32 compute-0 nova_compute[186176]: 2026-02-16 17:21:32.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:33 compute-0 nova_compute[186176]: 2026-02-16 17:21:33.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:21:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:21:38.147 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:21:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:21:38.148 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:21:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:21:38.148 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:21:46 compute-0 podman[206305]: 2026-02-16 17:21:46.116290478 +0000 UTC m=+0.081719949 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git)
Feb 16 17:21:50 compute-0 podman[206326]: 2026-02-16 17:21:50.09544737 +0000 UTC m=+0.067802062 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:21:57 compute-0 podman[206347]: 2026-02-16 17:21:57.102025071 +0000 UTC m=+0.068135450 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:21:57 compute-0 podman[206346]: 2026-02-16 17:21:57.127832405 +0000 UTC m=+0.104189199 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:21:59 compute-0 podman[195505]: time="2026-02-16T17:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:21:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:21:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2152 "" "Go-http-client/1.1"
Feb 16 17:22:01 compute-0 openstack_network_exporter[198360]: ERROR   17:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:22:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:22:01 compute-0 openstack_network_exporter[198360]: ERROR   17:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:22:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:22:07 compute-0 sshd-session[206395]: Connection closed by 177.197.70.223 port 56598
Feb 16 17:22:09 compute-0 sshd-session[206396]: Invalid user a from 177.197.70.223 port 56602
Feb 16 17:22:09 compute-0 sshd-session[206396]: Connection closed by invalid user a 177.197.70.223 port 56602 [preauth]
Feb 16 17:22:17 compute-0 podman[206398]: 2026-02-16 17:22:17.089996988 +0000 UTC m=+0.060891379 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Feb 16 17:22:21 compute-0 podman[206419]: 2026-02-16 17:22:21.076454631 +0000 UTC m=+0.046278775 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:22:27 compute-0 nova_compute[186176]: 2026-02-16 17:22:27.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:22:28 compute-0 podman[206440]: 2026-02-16 17:22:28.093842142 +0000 UTC m=+0.056942200 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:22:28 compute-0 podman[206439]: 2026-02-16 17:22:28.11345296 +0000 UTC m=+0.087065811 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 16 17:22:28 compute-0 nova_compute[186176]: 2026-02-16 17:22:28.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:22:28 compute-0 nova_compute[186176]: 2026-02-16 17:22:28.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:22:29 compute-0 nova_compute[186176]: 2026-02-16 17:22:29.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:22:29 compute-0 nova_compute[186176]: 2026-02-16 17:22:29.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:22:29 compute-0 nova_compute[186176]: 2026-02-16 17:22:29.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:22:29 compute-0 nova_compute[186176]: 2026-02-16 17:22:29.338 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:22:29 compute-0 nova_compute[186176]: 2026-02-16 17:22:29.338 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:22:29 compute-0 podman[195505]: time="2026-02-16T17:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:22:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:22:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2153 "" "Go-http-client/1.1"
Feb 16 17:22:30 compute-0 nova_compute[186176]: 2026-02-16 17:22:30.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:22:30 compute-0 nova_compute[186176]: 2026-02-16 17:22:30.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:22:31 compute-0 openstack_network_exporter[198360]: ERROR   17:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:22:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:22:31 compute-0 openstack_network_exporter[198360]: ERROR   17:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:22:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:22:31 compute-0 nova_compute[186176]: 2026-02-16 17:22:31.805 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:22:31 compute-0 nova_compute[186176]: 2026-02-16 17:22:31.806 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:22:31 compute-0 nova_compute[186176]: 2026-02-16 17:22:31.807 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:22:31 compute-0 nova_compute[186176]: 2026-02-16 17:22:31.807 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:22:31 compute-0 nova_compute[186176]: 2026-02-16 17:22:31.966 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:22:31 compute-0 nova_compute[186176]: 2026-02-16 17:22:31.967 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6187MB free_disk=73.26285171508789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:22:31 compute-0 nova_compute[186176]: 2026-02-16 17:22:31.968 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:22:31 compute-0 nova_compute[186176]: 2026-02-16 17:22:31.968 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:22:32 compute-0 nova_compute[186176]: 2026-02-16 17:22:32.100 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:22:32 compute-0 nova_compute[186176]: 2026-02-16 17:22:32.101 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:22:32 compute-0 nova_compute[186176]: 2026-02-16 17:22:32.127 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:22:32 compute-0 nova_compute[186176]: 2026-02-16 17:22:32.145 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:22:32 compute-0 nova_compute[186176]: 2026-02-16 17:22:32.146 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:22:32 compute-0 nova_compute[186176]: 2026-02-16 17:22:32.146 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:22:34 compute-0 nova_compute[186176]: 2026-02-16 17:22:34.147 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:22:34 compute-0 nova_compute[186176]: 2026-02-16 17:22:34.148 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:22:34 compute-0 nova_compute[186176]: 2026-02-16 17:22:34.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:22:34 compute-0 nova_compute[186176]: 2026-02-16 17:22:34.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:22:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:22:38.149 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:22:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:22:38.150 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:22:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:22:38.150 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:22:48 compute-0 podman[206490]: 2026-02-16 17:22:48.112137826 +0000 UTC m=+0.077582970 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z)
Feb 16 17:22:52 compute-0 podman[206512]: 2026-02-16 17:22:52.09758062 +0000 UTC m=+0.061959788 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Feb 16 17:22:59 compute-0 podman[206532]: 2026-02-16 17:22:59.098622428 +0000 UTC m=+0.068239122 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:22:59 compute-0 podman[206531]: 2026-02-16 17:22:59.14488685 +0000 UTC m=+0.115599411 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 17:22:59 compute-0 podman[195505]: time="2026-02-16T17:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:22:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:22:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2154 "" "Go-http-client/1.1"
Feb 16 17:23:01 compute-0 openstack_network_exporter[198360]: ERROR   17:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:23:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:23:01 compute-0 openstack_network_exporter[198360]: ERROR   17:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:23:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:23:19 compute-0 podman[206582]: 2026-02-16 17:23:19.08197346 +0000 UTC m=+0.055117720 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public)
Feb 16 17:23:23 compute-0 podman[206603]: 2026-02-16 17:23:23.104779899 +0000 UTC m=+0.065791832 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 17:23:24 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:23:24.235 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:23:24 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:23:24.237 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:23:28 compute-0 nova_compute[186176]: 2026-02-16 17:23:28.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:23:28 compute-0 nova_compute[186176]: 2026-02-16 17:23:28.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:23:29 compute-0 podman[195505]: time="2026-02-16T17:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:23:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:23:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2150 "" "Go-http-client/1.1"
Feb 16 17:23:30 compute-0 podman[206623]: 2026-02-16 17:23:30.108246197 +0000 UTC m=+0.066501008 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:23:30 compute-0 podman[206622]: 2026-02-16 17:23:30.130359019 +0000 UTC m=+0.093643073 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 17:23:30 compute-0 nova_compute[186176]: 2026-02-16 17:23:30.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:23:30 compute-0 nova_compute[186176]: 2026-02-16 17:23:30.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:23:30 compute-0 nova_compute[186176]: 2026-02-16 17:23:30.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:23:30 compute-0 nova_compute[186176]: 2026-02-16 17:23:30.333 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.348 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.348 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.349 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:23:31 compute-0 openstack_network_exporter[198360]: ERROR   17:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:23:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:23:31 compute-0 openstack_network_exporter[198360]: ERROR   17:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:23:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.539 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.541 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6176MB free_disk=73.26285171508789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.541 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.541 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.599 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.599 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.617 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.632 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.634 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:23:31 compute-0 nova_compute[186176]: 2026-02-16 17:23:31.635 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:23:32 compute-0 nova_compute[186176]: 2026-02-16 17:23:32.630 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:23:33 compute-0 nova_compute[186176]: 2026-02-16 17:23:33.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:23:34 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:23:34.240 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:23:34 compute-0 nova_compute[186176]: 2026-02-16 17:23:34.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:23:35 compute-0 nova_compute[186176]: 2026-02-16 17:23:35.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:23:36 compute-0 nova_compute[186176]: 2026-02-16 17:23:36.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:23:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:23:38.149 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:23:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:23:38.151 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:23:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:23:38.151 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:23:50 compute-0 podman[206673]: 2026-02-16 17:23:50.099770092 +0000 UTC m=+0.067108144 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2026-02-05T04:57:10Z, release=1770267347, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 17:23:54 compute-0 podman[206696]: 2026-02-16 17:23:54.089976651 +0000 UTC m=+0.061113377 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.030 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "970cfdb5-b102-447c-9730-04f44a0117c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.030 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.063 186180 DEBUG nova.compute.manager [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.204 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.205 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.220 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.221 186180 INFO nova.compute.claims [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.352 186180 DEBUG nova.compute.provider_tree [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.369 186180 DEBUG nova.scheduler.client.report [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.393 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.395 186180 DEBUG nova.compute.manager [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.440 186180 DEBUG nova.compute.manager [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.441 186180 DEBUG nova.network.neutron [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.469 186180 INFO nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.492 186180 DEBUG nova.compute.manager [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.605 186180 DEBUG nova.compute.manager [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.608 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.609 186180 INFO nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Creating image(s)
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.610 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.611 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.612 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.613 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:23:57 compute-0 nova_compute[186176]: 2026-02-16 17:23:57.614 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:23:58 compute-0 nova_compute[186176]: 2026-02-16 17:23:58.109 186180 WARNING oslo_policy.policy [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 16 17:23:58 compute-0 nova_compute[186176]: 2026-02-16 17:23:58.109 186180 WARNING oslo_policy.policy [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 16 17:23:58 compute-0 nova_compute[186176]: 2026-02-16 17:23:58.113 186180 DEBUG nova.policy [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '04e81d9e145a466bbabfe4fdaf9f09aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:23:59 compute-0 nova_compute[186176]: 2026-02-16 17:23:59.339 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:23:59 compute-0 nova_compute[186176]: 2026-02-16 17:23:59.403 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646.part --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:23:59 compute-0 nova_compute[186176]: 2026-02-16 17:23:59.404 186180 DEBUG nova.virt.images [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] 7a81518d-a287-4a96-937c-188ae866c5b8 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Feb 16 17:23:59 compute-0 nova_compute[186176]: 2026-02-16 17:23:59.407 186180 DEBUG nova.privsep.utils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 16 17:23:59 compute-0 nova_compute[186176]: 2026-02-16 17:23:59.407 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646.part /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:23:59 compute-0 nova_compute[186176]: 2026-02-16 17:23:59.570 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646.part /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646.converted" returned: 0 in 0.163s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:23:59 compute-0 nova_compute[186176]: 2026-02-16 17:23:59.573 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:23:59 compute-0 nova_compute[186176]: 2026-02-16 17:23:59.629 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646.converted --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:23:59 compute-0 nova_compute[186176]: 2026-02-16 17:23:59.632 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:23:59 compute-0 nova_compute[186176]: 2026-02-16 17:23:59.659 186180 INFO oslo.privsep.daemon [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpiu6tn7h9/privsep.sock']
Feb 16 17:23:59 compute-0 podman[195505]: time="2026-02-16T17:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:23:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:23:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2157 "" "Go-http-client/1.1"
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.202 186180 DEBUG nova.network.neutron [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Successfully created port: 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.390 186180 INFO oslo.privsep.daemon [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Spawned new privsep daemon via rootwrap
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.229 206738 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.235 206738 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.238 206738 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.239 206738 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206738
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.465 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.514 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.515 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.516 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.529 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.581 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.582 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.614 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.615 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.616 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.691 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.692 186180 DEBUG nova.virt.disk.api [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Checking if we can resize image /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.693 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.737 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.738 186180 DEBUG nova.virt.disk.api [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Cannot resize image /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.739 186180 DEBUG nova.objects.instance [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lazy-loading 'migration_context' on Instance uuid 970cfdb5-b102-447c-9730-04f44a0117c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.756 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.757 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Ensure instance console log exists: /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.757 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.758 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:00 compute-0 nova_compute[186176]: 2026-02-16 17:24:00.759 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:01 compute-0 podman[206756]: 2026-02-16 17:24:01.122220462 +0000 UTC m=+0.077700813 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:24:01 compute-0 podman[206755]: 2026-02-16 17:24:01.146302691 +0000 UTC m=+0.104398606 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:24:01 compute-0 openstack_network_exporter[198360]: ERROR   17:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:24:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:24:01 compute-0 openstack_network_exporter[198360]: ERROR   17:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:24:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:24:01 compute-0 nova_compute[186176]: 2026-02-16 17:24:01.526 186180 DEBUG nova.network.neutron [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Successfully updated port: 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:24:01 compute-0 nova_compute[186176]: 2026-02-16 17:24:01.543 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:24:01 compute-0 nova_compute[186176]: 2026-02-16 17:24:01.543 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquired lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:24:01 compute-0 nova_compute[186176]: 2026-02-16 17:24:01.543 186180 DEBUG nova.network.neutron [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:24:01 compute-0 nova_compute[186176]: 2026-02-16 17:24:01.910 186180 DEBUG nova.network.neutron [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:24:02 compute-0 nova_compute[186176]: 2026-02-16 17:24:01.999 186180 DEBUG nova.compute.manager [req-0c69c7c2-d655-4e60-b618-863730151b83 req-36a046c1-674c-40e5-b6cb-5daa599acf4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received event network-changed-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:24:02 compute-0 nova_compute[186176]: 2026-02-16 17:24:02.000 186180 DEBUG nova.compute.manager [req-0c69c7c2-d655-4e60-b618-863730151b83 req-36a046c1-674c-40e5-b6cb-5daa599acf4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Refreshing instance network info cache due to event network-changed-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:24:02 compute-0 nova_compute[186176]: 2026-02-16 17:24:02.000 186180 DEBUG oslo_concurrency.lockutils [req-0c69c7c2-d655-4e60-b618-863730151b83 req-36a046c1-674c-40e5-b6cb-5daa599acf4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.075 186180 DEBUG nova.network.neutron [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Updating instance_info_cache with network_info: [{"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.099 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Releasing lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.100 186180 DEBUG nova.compute.manager [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Instance network_info: |[{"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.101 186180 DEBUG oslo_concurrency.lockutils [req-0c69c7c2-d655-4e60-b618-863730151b83 req-36a046c1-674c-40e5-b6cb-5daa599acf4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.101 186180 DEBUG nova.network.neutron [req-0c69c7c2-d655-4e60-b618-863730151b83 req-36a046c1-674c-40e5-b6cb-5daa599acf4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Refreshing network info cache for port 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.105 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Start _get_guest_xml network_info=[{"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.112 186180 WARNING nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.119 186180 DEBUG nova.virt.libvirt.host [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.120 186180 DEBUG nova.virt.libvirt.host [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.131 186180 DEBUG nova.virt.libvirt.host [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.132 186180 DEBUG nova.virt.libvirt.host [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.134 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.134 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.135 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.136 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.136 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.137 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.137 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.138 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.138 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.138 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.139 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.139 186180 DEBUG nova.virt.hardware [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.146 186180 DEBUG nova.privsep.utils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.148 186180 DEBUG nova.virt.libvirt.vif [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:23:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-168706571',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-168706571',id=1,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-ncq3jj3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:23:57Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=970cfdb5-b102-447c-9730-04f44a0117c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.148 186180 DEBUG nova.network.os_vif_util [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converting VIF {"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.150 186180 DEBUG nova.network.os_vif_util [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:77:a3,bridge_name='br-int',has_traffic_filtering=True,id=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45f5a5dc-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.153 186180 DEBUG nova.objects.instance [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lazy-loading 'pci_devices' on Instance uuid 970cfdb5-b102-447c-9730-04f44a0117c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.179 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:24:03 compute-0 nova_compute[186176]:   <uuid>970cfdb5-b102-447c-9730-04f44a0117c2</uuid>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   <name>instance-00000001</name>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-168706571</nova:name>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:24:03</nova:creationTime>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:24:03 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:24:03 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:24:03 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:24:03 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:24:03 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:24:03 compute-0 nova_compute[186176]:         <nova:user uuid="04e81d9e145a466bbabfe4fdaf9f09aa">tempest-TestExecuteActionsViaActuator-900316824-project-member</nova:user>
Feb 16 17:24:03 compute-0 nova_compute[186176]:         <nova:project uuid="97a4c97daa7a495f91b4f65a132f7c0f">tempest-TestExecuteActionsViaActuator-900316824</nova:project>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:24:03 compute-0 nova_compute[186176]:         <nova:port uuid="45f5a5dc-c3f7-400d-80fb-aac0de24ed2c">
Feb 16 17:24:03 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <system>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <entry name="serial">970cfdb5-b102-447c-9730-04f44a0117c2</entry>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <entry name="uuid">970cfdb5-b102-447c-9730-04f44a0117c2</entry>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     </system>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   <os>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   </os>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   <features>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   </features>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.config"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:4a:77:a3"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <target dev="tap45f5a5dc-c3"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/console.log" append="off"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <video>
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     </video>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:24:03 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:24:03 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:24:03 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:24:03 compute-0 nova_compute[186176]: </domain>
Feb 16 17:24:03 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.180 186180 DEBUG nova.compute.manager [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Preparing to wait for external event network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.181 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.181 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.182 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.183 186180 DEBUG nova.virt.libvirt.vif [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:23:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-168706571',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-168706571',id=1,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-ncq3jj3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:23:57Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=970cfdb5-b102-447c-9730-04f44a0117c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.184 186180 DEBUG nova.network.os_vif_util [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converting VIF {"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.185 186180 DEBUG nova.network.os_vif_util [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4a:77:a3,bridge_name='br-int',has_traffic_filtering=True,id=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45f5a5dc-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.186 186180 DEBUG os_vif [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:77:a3,bridge_name='br-int',has_traffic_filtering=True,id=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45f5a5dc-c3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.293 186180 DEBUG ovsdbapp.backend.ovs_idl [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.294 186180 DEBUG ovsdbapp.backend.ovs_idl [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.296 186180 DEBUG ovsdbapp.backend.ovs_idl [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.296 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.297 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.297 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.298 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.299 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.301 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.313 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.314 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.314 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.315 186180 INFO oslo.privsep.daemon [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpmvxn_nm9/privsep.sock']
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.962 186180 INFO oslo.privsep.daemon [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Spawned new privsep daemon via rootwrap
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.848 206807 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.853 206807 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.857 206807 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 16 17:24:03 compute-0 nova_compute[186176]: 2026-02-16 17:24:03.857 206807 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206807
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.251 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.252 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap45f5a5dc-c3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.253 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap45f5a5dc-c3, col_values=(('external_ids', {'iface-id': '45f5a5dc-c3f7-400d-80fb-aac0de24ed2c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4a:77:a3', 'vm-uuid': '970cfdb5-b102-447c-9730-04f44a0117c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.256 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:04 compute-0 NetworkManager[56463]: <info>  [1771262644.2582] manager: (tap45f5a5dc-c3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.260 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.266 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.268 186180 INFO os_vif [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4a:77:a3,bridge_name='br-int',has_traffic_filtering=True,id=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45f5a5dc-c3')
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.326 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.327 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.327 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] No VIF found with MAC fa:16:3e:4a:77:a3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.328 186180 INFO nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Using config drive
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.419 186180 DEBUG nova.network.neutron [req-0c69c7c2-d655-4e60-b618-863730151b83 req-36a046c1-674c-40e5-b6cb-5daa599acf4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Updated VIF entry in instance network info cache for port 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.420 186180 DEBUG nova.network.neutron [req-0c69c7c2-d655-4e60-b618-863730151b83 req-36a046c1-674c-40e5-b6cb-5daa599acf4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Updating instance_info_cache with network_info: [{"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.438 186180 DEBUG oslo_concurrency.lockutils [req-0c69c7c2-d655-4e60-b618-863730151b83 req-36a046c1-674c-40e5-b6cb-5daa599acf4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.932 186180 INFO nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Creating config drive at /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.config
Feb 16 17:24:04 compute-0 nova_compute[186176]: 2026-02-16 17:24:04.936 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcdnp_6s2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.054 186180 DEBUG oslo_concurrency.processutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcdnp_6s2" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:05 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 16 17:24:05 compute-0 kernel: tap45f5a5dc-c3: entered promiscuous mode
Feb 16 17:24:05 compute-0 NetworkManager[56463]: <info>  [1771262645.1305] manager: (tap45f5a5dc-c3): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Feb 16 17:24:05 compute-0 ovn_controller[96437]: 2026-02-16T17:24:05Z|00027|binding|INFO|Claiming lport 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c for this chassis.
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.132 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:05 compute-0 ovn_controller[96437]: 2026-02-16T17:24:05Z|00028|binding|INFO|45f5a5dc-c3f7-400d-80fb-aac0de24ed2c: Claiming fa:16:3e:4a:77:a3 10.100.0.7
Feb 16 17:24:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:05.154 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:77:a3 10.100.0.7'], port_security=['fa:16:3e:4a:77:a3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '970cfdb5-b102-447c-9730-04f44a0117c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a09b650a-b8da-4ec6-af84-f46bd29af7dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18cf134a-5b0b-4046-bb3d-fdfa0b081c31, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:24:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:05.156 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c in datapath 50b90e9d-0874-4370-ad17-1fff2c4cce15 bound to our chassis
Feb 16 17:24:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:05.159 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:24:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:05.160 105730 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpg5fs2heu/privsep.sock']
Feb 16 17:24:05 compute-0 systemd-udevd[206831]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:24:05 compute-0 NetworkManager[56463]: <info>  [1771262645.1814] device (tap45f5a5dc-c3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:24:05 compute-0 NetworkManager[56463]: <info>  [1771262645.1819] device (tap45f5a5dc-c3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.185 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:05 compute-0 ovn_controller[96437]: 2026-02-16T17:24:05Z|00029|binding|INFO|Setting lport 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c ovn-installed in OVS
Feb 16 17:24:05 compute-0 ovn_controller[96437]: 2026-02-16T17:24:05Z|00030|binding|INFO|Setting lport 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c up in Southbound
Feb 16 17:24:05 compute-0 systemd-machined[155631]: New machine qemu-1-instance-00000001.
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.195 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:05 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.384 186180 DEBUG nova.compute.manager [req-88b0b6e9-4320-4909-b3d2-ce011263f434 req-3d52016a-4a58-432f-aa58-401fd229bdb6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received event network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.385 186180 DEBUG oslo_concurrency.lockutils [req-88b0b6e9-4320-4909-b3d2-ce011263f434 req-3d52016a-4a58-432f-aa58-401fd229bdb6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.386 186180 DEBUG oslo_concurrency.lockutils [req-88b0b6e9-4320-4909-b3d2-ce011263f434 req-3d52016a-4a58-432f-aa58-401fd229bdb6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.386 186180 DEBUG oslo_concurrency.lockutils [req-88b0b6e9-4320-4909-b3d2-ce011263f434 req-3d52016a-4a58-432f-aa58-401fd229bdb6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.387 186180 DEBUG nova.compute.manager [req-88b0b6e9-4320-4909-b3d2-ce011263f434 req-3d52016a-4a58-432f-aa58-401fd229bdb6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Processing event network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.530 186180 DEBUG nova.compute.manager [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.532 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262645.5297453, 970cfdb5-b102-447c-9730-04f44a0117c2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.532 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] VM Started (Lifecycle Event)
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.536 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.543 186180 INFO nova.virt.libvirt.driver [-] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Instance spawned successfully.
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.544 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.591 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.600 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.606 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.606 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.607 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.608 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.609 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.610 186180 DEBUG nova.virt.libvirt.driver [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.624 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.625 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262645.5310225, 970cfdb5-b102-447c-9730-04f44a0117c2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.625 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] VM Paused (Lifecycle Event)
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.648 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.653 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262645.5336397, 970cfdb5-b102-447c-9730-04f44a0117c2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.653 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] VM Resumed (Lifecycle Event)
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.673 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.676 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.685 186180 INFO nova.compute.manager [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Took 8.08 seconds to spawn the instance on the hypervisor.
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.686 186180 DEBUG nova.compute.manager [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.696 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.753 186180 INFO nova.compute.manager [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Took 8.59 seconds to build instance.
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.777 186180 DEBUG oslo_concurrency.lockutils [None req-940e062c-03c6-45f9-8361-1769d04d3a59 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:05.786 105730 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 16 17:24:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:05.787 105730 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpg5fs2heu/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 16 17:24:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:05.679 206858 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 17:24:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:05.684 206858 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 17:24:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:05.686 206858 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 16 17:24:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:05.686 206858 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206858
Feb 16 17:24:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:05.794 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[0e5cc315-c325-47b4-944b-57b1aca02cde]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:05 compute-0 nova_compute[186176]: 2026-02-16 17:24:05.970 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:06 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:06.282 206858 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:06 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:06.282 206858 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:06 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:06.282 206858 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:06 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:06.784 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[345b9576-f570-4e11-85e6-55dc81c368dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:06 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:06.785 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap50b90e9d-01 in ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:24:06 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:06.787 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap50b90e9d-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:24:06 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:06.787 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[27efc253-c484-4b91-87a2-97061ca82dd9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:06 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:06.790 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[acda9416-9e1f-4005-aa15-0be7fa366f16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:06 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:06.811 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[001d3c72-a6e5-4a57-ba3f-375ded76a6e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:06 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:06.834 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[183f937d-4a85-4743-b467-3dd71dc4d215]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:06 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:06.836 105730 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp79hfq99i/privsep.sock']
Feb 16 17:24:07 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:07.457 105730 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 16 17:24:07 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:07.458 105730 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp79hfq99i/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 16 17:24:07 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:07.344 206872 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 16 17:24:07 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:07.350 206872 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 16 17:24:07 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:07.354 206872 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 16 17:24:07 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:07.354 206872 INFO oslo.privsep.daemon [-] privsep daemon running as pid 206872
Feb 16 17:24:07 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:07.460 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[3aa14edc-f424-45b8-9ebd-f2d4e0a2d279]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:07 compute-0 nova_compute[186176]: 2026-02-16 17:24:07.478 186180 DEBUG nova.compute.manager [req-18aafc98-89ca-4f76-99c8-3cd4094ce3c8 req-22b0a426-2853-4a7f-b27d-b9a96c642f08 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received event network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:24:07 compute-0 nova_compute[186176]: 2026-02-16 17:24:07.479 186180 DEBUG oslo_concurrency.lockutils [req-18aafc98-89ca-4f76-99c8-3cd4094ce3c8 req-22b0a426-2853-4a7f-b27d-b9a96c642f08 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:07 compute-0 nova_compute[186176]: 2026-02-16 17:24:07.479 186180 DEBUG oslo_concurrency.lockutils [req-18aafc98-89ca-4f76-99c8-3cd4094ce3c8 req-22b0a426-2853-4a7f-b27d-b9a96c642f08 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:07 compute-0 nova_compute[186176]: 2026-02-16 17:24:07.480 186180 DEBUG oslo_concurrency.lockutils [req-18aafc98-89ca-4f76-99c8-3cd4094ce3c8 req-22b0a426-2853-4a7f-b27d-b9a96c642f08 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:07 compute-0 nova_compute[186176]: 2026-02-16 17:24:07.480 186180 DEBUG nova.compute.manager [req-18aafc98-89ca-4f76-99c8-3cd4094ce3c8 req-22b0a426-2853-4a7f-b27d-b9a96c642f08 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] No waiting events found dispatching network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:24:07 compute-0 nova_compute[186176]: 2026-02-16 17:24:07.481 186180 WARNING nova.compute.manager [req-18aafc98-89ca-4f76-99c8-3cd4094ce3c8 req-22b0a426-2853-4a7f-b27d-b9a96c642f08 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received unexpected event network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c for instance with vm_state active and task_state None.
Feb 16 17:24:07 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:07.920 206872 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:07 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:07.921 206872 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:07 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:07.921 206872 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.484 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[678cfc68-ce7d-4f3c-8ffa-33d75c41ecc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:08 compute-0 NetworkManager[56463]: <info>  [1771262648.5064] manager: (tap50b90e9d-00): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.505 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b6e926-9c5a-4466-907d-09733018e2ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.530 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[f8d07ac4-3af8-4c6e-88c9-04560e0c909e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.533 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[7cd37a54-1b05-4da9-9b34-d85e86382652]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:08 compute-0 systemd-udevd[206884]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:24:08 compute-0 NetworkManager[56463]: <info>  [1771262648.5510] device (tap50b90e9d-00): carrier: link connected
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.553 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[01b43603-e50d-41df-9855-4ac459691fd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.578 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[d68ca9f5-3190-4848-b0dd-4b227119187c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b90e9d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:d8:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424407, 'reachable_time': 17402, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 206898, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.596 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[a67acb5b-f235-4225-bff3-b9972e8cc133]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8c:d889'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424407, 'tstamp': 424407}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 206903, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.615 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[887f61c6-f963-4db0-af26-1ede2c867cb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b90e9d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:d8:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424407, 'reachable_time': 17402, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 206904, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.649 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[949595b4-1b9b-485b-b71f-6f59006f7d27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.715 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[d4127d18-664f-496e-91f2-6330f5767222]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.717 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b90e9d-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.718 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.719 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50b90e9d-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:24:08 compute-0 nova_compute[186176]: 2026-02-16 17:24:08.721 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:08 compute-0 NetworkManager[56463]: <info>  [1771262648.7226] manager: (tap50b90e9d-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Feb 16 17:24:08 compute-0 kernel: tap50b90e9d-00: entered promiscuous mode
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.726 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap50b90e9d-00, col_values=(('external_ids', {'iface-id': '7e8ec4b7-6252-49aa-a342-59a2b0f3de95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:24:08 compute-0 nova_compute[186176]: 2026-02-16 17:24:08.727 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:08 compute-0 ovn_controller[96437]: 2026-02-16T17:24:08Z|00031|binding|INFO|Releasing lport 7e8ec4b7-6252-49aa-a342-59a2b0f3de95 from this chassis (sb_readonly=0)
Feb 16 17:24:08 compute-0 nova_compute[186176]: 2026-02-16 17:24:08.734 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.737 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/50b90e9d-0874-4370-ad17-1fff2c4cce15.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/50b90e9d-0874-4370-ad17-1fff2c4cce15.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.738 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[a452b603-909d-49c5-bea0-e17c5cf34e89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.740 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/50b90e9d-0874-4370-ad17-1fff2c4cce15.pid.haproxy
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:24:08 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:08.741 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'env', 'PROCESS_TAG=haproxy-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/50b90e9d-0874-4370-ad17-1fff2c4cce15.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:24:09 compute-0 podman[206935]: 2026-02-16 17:24:09.111780591 +0000 UTC m=+0.065234968 container create cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:24:09 compute-0 systemd[1]: Started libpod-conmon-cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0.scope.
Feb 16 17:24:09 compute-0 podman[206935]: 2026-02-16 17:24:09.076288872 +0000 UTC m=+0.029743349 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:24:09 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:24:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52cbc8d94a2edfa2e7f87f6cf5c4f30cbca076e6224891ea6a68124ef5667de2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:24:09 compute-0 podman[206935]: 2026-02-16 17:24:09.192192209 +0000 UTC m=+0.145646666 container init cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 16 17:24:09 compute-0 podman[206935]: 2026-02-16 17:24:09.196637898 +0000 UTC m=+0.150092295 container start cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Feb 16 17:24:09 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[206951]: [NOTICE]   (206955) : New worker (206957) forked
Feb 16 17:24:09 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[206951]: [NOTICE]   (206955) : Loading success.
Feb 16 17:24:09 compute-0 nova_compute[186176]: 2026-02-16 17:24:09.257 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:10 compute-0 nova_compute[186176]: 2026-02-16 17:24:10.972 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:14 compute-0 nova_compute[186176]: 2026-02-16 17:24:14.260 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:15 compute-0 nova_compute[186176]: 2026-02-16 17:24:15.977 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:18 compute-0 ovn_controller[96437]: 2026-02-16T17:24:18Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4a:77:a3 10.100.0.7
Feb 16 17:24:18 compute-0 ovn_controller[96437]: 2026-02-16T17:24:18Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4a:77:a3 10.100.0.7
Feb 16 17:24:19 compute-0 nova_compute[186176]: 2026-02-16 17:24:19.262 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:21 compute-0 nova_compute[186176]: 2026-02-16 17:24:21.038 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:21 compute-0 podman[206984]: 2026-02-16 17:24:21.147306122 +0000 UTC m=+0.076351979 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible)
Feb 16 17:24:24 compute-0 nova_compute[186176]: 2026-02-16 17:24:24.265 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:25 compute-0 podman[207007]: 2026-02-16 17:24:25.097919752 +0000 UTC m=+0.064189202 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 17:24:26 compute-0 nova_compute[186176]: 2026-02-16 17:24:26.039 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:29 compute-0 nova_compute[186176]: 2026-02-16 17:24:29.267 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:29 compute-0 podman[195505]: time="2026-02-16T17:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:24:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:24:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2622 "" "Go-http-client/1.1"
Feb 16 17:24:30 compute-0 nova_compute[186176]: 2026-02-16 17:24:30.313 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:24:30 compute-0 nova_compute[186176]: 2026-02-16 17:24:30.334 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:24:30 compute-0 nova_compute[186176]: 2026-02-16 17:24:30.335 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:24:30 compute-0 nova_compute[186176]: 2026-02-16 17:24:30.335 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:24:31 compute-0 nova_compute[186176]: 2026-02-16 17:24:31.041 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:31 compute-0 nova_compute[186176]: 2026-02-16 17:24:31.047 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:24:31 compute-0 nova_compute[186176]: 2026-02-16 17:24:31.048 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:24:31 compute-0 nova_compute[186176]: 2026-02-16 17:24:31.048 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:24:31 compute-0 nova_compute[186176]: 2026-02-16 17:24:31.048 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 970cfdb5-b102-447c-9730-04f44a0117c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:24:31 compute-0 openstack_network_exporter[198360]: ERROR   17:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:24:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:24:31 compute-0 openstack_network_exporter[198360]: ERROR   17:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:24:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:24:32 compute-0 podman[207027]: 2026-02-16 17:24:32.10437724 +0000 UTC m=+0.071715426 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:24:32 compute-0 podman[207026]: 2026-02-16 17:24:32.142949824 +0000 UTC m=+0.110103385 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.271 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.310 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Updating instance_info_cache with network_info: [{"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.328 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.329 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.329 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.329 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.329 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.330 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.330 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.350 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.413 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.481 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.482 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.527 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.722 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.724 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5694MB free_disk=73.1997299194336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.724 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.725 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.816 186180 INFO nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Updating resource usage from migration ad5225c4-2c71-406e-803b-68778cae39ad
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.844 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Migration ad5225c4-2c71-406e-803b-68778cae39ad is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.844 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.845 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.889 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.924 186180 ERROR nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [req-a4f2cb81-6936-41f7-b184-4084a9cf729a] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID bb904aac-529f-46ef-9861-9c655a4b383c.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-a4f2cb81-6936-41f7-b184-4084a9cf729a"}]}
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.939 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing inventories for resource provider bb904aac-529f-46ef-9861-9c655a4b383c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.956 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating ProviderTree inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.957 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.969 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing aggregate associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 17:24:34 compute-0 nova_compute[186176]: 2026-02-16 17:24:34.992 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing trait associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 17:24:35 compute-0 nova_compute[186176]: 2026-02-16 17:24:35.034 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:24:35 compute-0 nova_compute[186176]: 2026-02-16 17:24:35.071 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updated inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 16 17:24:35 compute-0 nova_compute[186176]: 2026-02-16 17:24:35.071 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating resource provider bb904aac-529f-46ef-9861-9c655a4b383c generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 17:24:35 compute-0 nova_compute[186176]: 2026-02-16 17:24:35.072 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:24:35 compute-0 nova_compute[186176]: 2026-02-16 17:24:35.095 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:24:35 compute-0 nova_compute[186176]: 2026-02-16 17:24:35.095 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.371s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:36 compute-0 nova_compute[186176]: 2026-02-16 17:24:36.079 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:36 compute-0 nova_compute[186176]: 2026-02-16 17:24:36.082 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:24:36 compute-0 nova_compute[186176]: 2026-02-16 17:24:36.083 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:24:36 compute-0 nova_compute[186176]: 2026-02-16 17:24:36.322 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:24:36 compute-0 nova_compute[186176]: 2026-02-16 17:24:36.563 186180 DEBUG oslo_concurrency.lockutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:24:36 compute-0 nova_compute[186176]: 2026-02-16 17:24:36.564 186180 DEBUG oslo_concurrency.lockutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:24:36 compute-0 nova_compute[186176]: 2026-02-16 17:24:36.564 186180 DEBUG nova.network.neutron [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:24:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:38.151 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:38.152 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:38.152 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:38 compute-0 nova_compute[186176]: 2026-02-16 17:24:38.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.199 186180 DEBUG nova.network.neutron [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Updating instance_info_cache with network_info: [{"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.215 186180 DEBUG oslo_concurrency.lockutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.276 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.306 186180 DEBUG nova.virt.libvirt.driver [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.306 186180 DEBUG nova.virt.libvirt.volume.remotefs [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Creating file /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/3752b5af27c64eba834fd1c5978e03cb.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.306 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/3752b5af27c64eba834fd1c5978e03cb.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.683 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/3752b5af27c64eba834fd1c5978e03cb.tmp" returned: 1 in 0.377s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.686 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/3752b5af27c64eba834fd1c5978e03cb.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.686 186180 DEBUG nova.virt.libvirt.volume.remotefs [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Creating directory /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.687 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.880 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:39 compute-0 nova_compute[186176]: 2026-02-16 17:24:39.885 186180 DEBUG nova.virt.libvirt.driver [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Feb 16 17:24:41 compute-0 nova_compute[186176]: 2026-02-16 17:24:41.082 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 kernel: tap45f5a5dc-c3 (unregistering): left promiscuous mode
Feb 16 17:24:42 compute-0 NetworkManager[56463]: <info>  [1771262682.1121] device (tap45f5a5dc-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:24:42 compute-0 ovn_controller[96437]: 2026-02-16T17:24:42Z|00032|binding|INFO|Releasing lport 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c from this chassis (sb_readonly=0)
Feb 16 17:24:42 compute-0 ovn_controller[96437]: 2026-02-16T17:24:42Z|00033|binding|INFO|Setting lport 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c down in Southbound
Feb 16 17:24:42 compute-0 ovn_controller[96437]: 2026-02-16T17:24:42Z|00034|binding|INFO|Removing iface tap45f5a5dc-c3 ovn-installed in OVS
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.118 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.120 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.127 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.132 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4a:77:a3 10.100.0.7'], port_security=['fa:16:3e:4a:77:a3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '970cfdb5-b102-447c-9730-04f44a0117c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a09b650a-b8da-4ec6-af84-f46bd29af7dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18cf134a-5b0b-4046-bb3d-fdfa0b081c31, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.134 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c in datapath 50b90e9d-0874-4370-ad17-1fff2c4cce15 unbound from our chassis
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.135 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 50b90e9d-0874-4370-ad17-1fff2c4cce15, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.137 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe7944c-b7a7-4aac-b881-68ce8f8bad4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.137 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15 namespace which is not needed anymore
Feb 16 17:24:42 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Feb 16 17:24:42 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 13.400s CPU time.
Feb 16 17:24:42 compute-0 systemd-machined[155631]: Machine qemu-1-instance-00000001 terminated.
Feb 16 17:24:42 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[206951]: [NOTICE]   (206955) : haproxy version is 2.8.14-c23fe91
Feb 16 17:24:42 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[206951]: [NOTICE]   (206955) : path to executable is /usr/sbin/haproxy
Feb 16 17:24:42 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[206951]: [WARNING]  (206955) : Exiting Master process...
Feb 16 17:24:42 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[206951]: [WARNING]  (206955) : Exiting Master process...
Feb 16 17:24:42 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[206951]: [ALERT]    (206955) : Current worker (206957) exited with code 143 (Terminated)
Feb 16 17:24:42 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[206951]: [WARNING]  (206955) : All workers exited. Exiting... (0)
Feb 16 17:24:42 compute-0 systemd[1]: libpod-cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0.scope: Deactivated successfully.
Feb 16 17:24:42 compute-0 podman[207120]: 2026-02-16 17:24:42.304898782 +0000 UTC m=+0.052589138 container died cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:24:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0-userdata-shm.mount: Deactivated successfully.
Feb 16 17:24:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-52cbc8d94a2edfa2e7f87f6cf5c4f30cbca076e6224891ea6a68124ef5667de2-merged.mount: Deactivated successfully.
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.343 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 podman[207120]: 2026-02-16 17:24:42.345400254 +0000 UTC m=+0.093090610 container cleanup cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.351 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 systemd[1]: libpod-conmon-cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0.scope: Deactivated successfully.
Feb 16 17:24:42 compute-0 podman[207158]: 2026-02-16 17:24:42.428019696 +0000 UTC m=+0.055076449 container remove cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.431 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4e4608-7699-4a2c-9b51-619e21290c74]: (4, ('Mon Feb 16 05:24:42 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15 (cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0)\ncffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0\nMon Feb 16 05:24:42 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15 (cffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0)\ncffc78fe9d3e0cdaf2066c46265cf4abb799835dc5f8bf7a2814a029c983c6b0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.434 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a8f1bd-6ab3-4cfd-bd0c-0750a734351c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.435 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b90e9d-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:24:42 compute-0 kernel: tap50b90e9d-00: left promiscuous mode
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.437 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.445 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.448 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b9725996-8a14-412c-af08-f684549b72ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.464 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e4fc3461-2be5-47b2-8c07-1aa8e669cdc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.466 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[0aa12ad7-fc13-47e9-91a8-bcb4fd73e3fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.481 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[9bdca979-bb09-4b20-908b-d41392f3f71e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424400, 'reachable_time': 41349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207183, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:42 compute-0 systemd[1]: run-netns-ovnmeta\x2d50b90e9d\x2d0874\x2d4370\x2dad17\x2d1fff2c4cce15.mount: Deactivated successfully.
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.499 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.500 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[ac200e6f-f073-4a71-95d1-bf909e4af7d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.626 186180 DEBUG nova.compute.manager [req-bf92df86-91e9-4c2d-a6cc-bd39219b2373 req-04dd74fc-822d-45b8-993a-023eeb28c40a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received event network-vif-unplugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.627 186180 DEBUG oslo_concurrency.lockutils [req-bf92df86-91e9-4c2d-a6cc-bd39219b2373 req-04dd74fc-822d-45b8-993a-023eeb28c40a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.628 186180 DEBUG oslo_concurrency.lockutils [req-bf92df86-91e9-4c2d-a6cc-bd39219b2373 req-04dd74fc-822d-45b8-993a-023eeb28c40a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.628 186180 DEBUG oslo_concurrency.lockutils [req-bf92df86-91e9-4c2d-a6cc-bd39219b2373 req-04dd74fc-822d-45b8-993a-023eeb28c40a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.629 186180 DEBUG nova.compute.manager [req-bf92df86-91e9-4c2d-a6cc-bd39219b2373 req-04dd74fc-822d-45b8-993a-023eeb28c40a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] No waiting events found dispatching network-vif-unplugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.629 186180 WARNING nova.compute.manager [req-bf92df86-91e9-4c2d-a6cc-bd39219b2373 req-04dd74fc-822d-45b8-993a-023eeb28c40a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received unexpected event network-vif-unplugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c for instance with vm_state active and task_state resize_migrating.
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.727 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.727 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:42.729 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.906 186180 INFO nova.virt.libvirt.driver [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Instance shutdown successfully after 3 seconds.
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.912 186180 INFO nova.virt.libvirt.driver [-] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Instance destroyed successfully.
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.914 186180 DEBUG nova.virt.libvirt.vif [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:23:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-168706571',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-168706571',id=1,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:24:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-ncq3jj3z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:24:35Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=970cfdb5-b102-447c-9730-04f44a0117c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "vif_mac": "fa:16:3e:4a:77:a3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.914 186180 DEBUG nova.network.os_vif_util [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "vif_mac": "fa:16:3e:4a:77:a3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.915 186180 DEBUG nova.network.os_vif_util [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:77:a3,bridge_name='br-int',has_traffic_filtering=True,id=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45f5a5dc-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.915 186180 DEBUG os_vif [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:77:a3,bridge_name='br-int',has_traffic_filtering=True,id=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45f5a5dc-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.917 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.918 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45f5a5dc-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.919 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.921 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.925 186180 INFO os_vif [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:77:a3,bridge_name='br-int',has_traffic_filtering=True,id=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45f5a5dc-c3')
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.931 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.994 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:42 compute-0 nova_compute[186176]: 2026-02-16 17:24:42.995 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:43 compute-0 nova_compute[186176]: 2026-02-16 17:24:43.043 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:43 compute-0 nova_compute[186176]: 2026-02-16 17:24:43.045 186180 DEBUG nova.virt.libvirt.volume.remotefs [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Copying file /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2_resize/disk to 192.168.122.101:/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 17:24:43 compute-0 nova_compute[186176]: 2026-02-16 17:24:43.045 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2_resize/disk 192.168.122.101:/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:43 compute-0 nova_compute[186176]: 2026-02-16 17:24:43.500 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "scp -r /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2_resize/disk 192.168.122.101:/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:43 compute-0 nova_compute[186176]: 2026-02-16 17:24:43.501 186180 DEBUG nova.virt.libvirt.volume.remotefs [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Copying file /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 17:24:43 compute-0 nova_compute[186176]: 2026-02-16 17:24:43.501 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2_resize/disk.config 192.168.122.101:/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:43 compute-0 nova_compute[186176]: 2026-02-16 17:24:43.718 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "scp -C -r /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2_resize/disk.config 192.168.122.101:/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.config" returned: 0 in 0.217s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:43 compute-0 nova_compute[186176]: 2026-02-16 17:24:43.720 186180 DEBUG nova.virt.libvirt.volume.remotefs [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Copying file /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 17:24:43 compute-0 nova_compute[186176]: 2026-02-16 17:24:43.720 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2_resize/disk.info 192.168.122.101:/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:24:43 compute-0 nova_compute[186176]: 2026-02-16 17:24:43.913 186180 DEBUG oslo_concurrency.processutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "scp -C -r /var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2_resize/disk.info 192.168.122.101:/var/lib/nova/instances/970cfdb5-b102-447c-9730-04f44a0117c2/disk.info" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:24:44 compute-0 nova_compute[186176]: 2026-02-16 17:24:44.783 186180 DEBUG nova.compute.manager [req-64c49c50-f05b-4241-8003-144d47ad8c07 req-165002e0-8564-4a74-a288-6659d98263a1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received event network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:24:44 compute-0 nova_compute[186176]: 2026-02-16 17:24:44.784 186180 DEBUG oslo_concurrency.lockutils [req-64c49c50-f05b-4241-8003-144d47ad8c07 req-165002e0-8564-4a74-a288-6659d98263a1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:44 compute-0 nova_compute[186176]: 2026-02-16 17:24:44.784 186180 DEBUG oslo_concurrency.lockutils [req-64c49c50-f05b-4241-8003-144d47ad8c07 req-165002e0-8564-4a74-a288-6659d98263a1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:44 compute-0 nova_compute[186176]: 2026-02-16 17:24:44.785 186180 DEBUG oslo_concurrency.lockutils [req-64c49c50-f05b-4241-8003-144d47ad8c07 req-165002e0-8564-4a74-a288-6659d98263a1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:44 compute-0 nova_compute[186176]: 2026-02-16 17:24:44.785 186180 DEBUG nova.compute.manager [req-64c49c50-f05b-4241-8003-144d47ad8c07 req-165002e0-8564-4a74-a288-6659d98263a1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] No waiting events found dispatching network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:24:44 compute-0 nova_compute[186176]: 2026-02-16 17:24:44.786 186180 WARNING nova.compute.manager [req-64c49c50-f05b-4241-8003-144d47ad8c07 req-165002e0-8564-4a74-a288-6659d98263a1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received unexpected event network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c for instance with vm_state active and task_state resize_migrating.
Feb 16 17:24:45 compute-0 nova_compute[186176]: 2026-02-16 17:24:45.064 186180 DEBUG neutronclient.v2_0.client [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Feb 16 17:24:45 compute-0 nova_compute[186176]: 2026-02-16 17:24:45.157 186180 DEBUG oslo_concurrency.lockutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:24:45 compute-0 nova_compute[186176]: 2026-02-16 17:24:45.158 186180 DEBUG oslo_concurrency.lockutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:24:45 compute-0 nova_compute[186176]: 2026-02-16 17:24:45.168 186180 INFO nova.compute.rpcapi [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Feb 16 17:24:45 compute-0 nova_compute[186176]: 2026-02-16 17:24:45.169 186180 DEBUG oslo_concurrency.lockutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:24:45 compute-0 nova_compute[186176]: 2026-02-16 17:24:45.189 186180 DEBUG oslo_concurrency.lockutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:45 compute-0 nova_compute[186176]: 2026-02-16 17:24:45.190 186180 DEBUG oslo_concurrency.lockutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:45 compute-0 nova_compute[186176]: 2026-02-16 17:24:45.191 186180 DEBUG oslo_concurrency.lockutils [None req-67cedd7d-fd4f-4f72-a22a-c4c98021cb28 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:46 compute-0 nova_compute[186176]: 2026-02-16 17:24:46.085 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:24:46.732 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:24:46 compute-0 nova_compute[186176]: 2026-02-16 17:24:46.963 186180 DEBUG nova.compute.manager [req-85f4cc77-2105-4af2-84a9-7550edae95fc req-889be2a8-0339-4ba6-8eb6-1c81c93f7a68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received event network-changed-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:24:46 compute-0 nova_compute[186176]: 2026-02-16 17:24:46.964 186180 DEBUG nova.compute.manager [req-85f4cc77-2105-4af2-84a9-7550edae95fc req-889be2a8-0339-4ba6-8eb6-1c81c93f7a68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Refreshing instance network info cache due to event network-changed-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:24:46 compute-0 nova_compute[186176]: 2026-02-16 17:24:46.964 186180 DEBUG oslo_concurrency.lockutils [req-85f4cc77-2105-4af2-84a9-7550edae95fc req-889be2a8-0339-4ba6-8eb6-1c81c93f7a68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:24:46 compute-0 nova_compute[186176]: 2026-02-16 17:24:46.964 186180 DEBUG oslo_concurrency.lockutils [req-85f4cc77-2105-4af2-84a9-7550edae95fc req-889be2a8-0339-4ba6-8eb6-1c81c93f7a68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:24:46 compute-0 nova_compute[186176]: 2026-02-16 17:24:46.964 186180 DEBUG nova.network.neutron [req-85f4cc77-2105-4af2-84a9-7550edae95fc req-889be2a8-0339-4ba6-8eb6-1c81c93f7a68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Refreshing network info cache for port 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:24:47 compute-0 nova_compute[186176]: 2026-02-16 17:24:47.920 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:48 compute-0 nova_compute[186176]: 2026-02-16 17:24:48.541 186180 DEBUG nova.network.neutron [req-85f4cc77-2105-4af2-84a9-7550edae95fc req-889be2a8-0339-4ba6-8eb6-1c81c93f7a68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Updated VIF entry in instance network info cache for port 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:24:48 compute-0 nova_compute[186176]: 2026-02-16 17:24:48.541 186180 DEBUG nova.network.neutron [req-85f4cc77-2105-4af2-84a9-7550edae95fc req-889be2a8-0339-4ba6-8eb6-1c81c93f7a68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Updating instance_info_cache with network_info: [{"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:24:48 compute-0 nova_compute[186176]: 2026-02-16 17:24:48.557 186180 DEBUG oslo_concurrency.lockutils [req-85f4cc77-2105-4af2-84a9-7550edae95fc req-889be2a8-0339-4ba6-8eb6-1c81c93f7a68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:24:49 compute-0 nova_compute[186176]: 2026-02-16 17:24:49.128 186180 DEBUG nova.compute.manager [req-ff9e91d3-817a-45ff-8133-b32f465b157e req-f4d3c178-0b02-4277-b7af-5896d9fee5d0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received event network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:24:49 compute-0 nova_compute[186176]: 2026-02-16 17:24:49.128 186180 DEBUG oslo_concurrency.lockutils [req-ff9e91d3-817a-45ff-8133-b32f465b157e req-f4d3c178-0b02-4277-b7af-5896d9fee5d0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:49 compute-0 nova_compute[186176]: 2026-02-16 17:24:49.129 186180 DEBUG oslo_concurrency.lockutils [req-ff9e91d3-817a-45ff-8133-b32f465b157e req-f4d3c178-0b02-4277-b7af-5896d9fee5d0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:49 compute-0 nova_compute[186176]: 2026-02-16 17:24:49.129 186180 DEBUG oslo_concurrency.lockutils [req-ff9e91d3-817a-45ff-8133-b32f465b157e req-f4d3c178-0b02-4277-b7af-5896d9fee5d0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:49 compute-0 nova_compute[186176]: 2026-02-16 17:24:49.129 186180 DEBUG nova.compute.manager [req-ff9e91d3-817a-45ff-8133-b32f465b157e req-f4d3c178-0b02-4277-b7af-5896d9fee5d0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] No waiting events found dispatching network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:24:49 compute-0 nova_compute[186176]: 2026-02-16 17:24:49.129 186180 WARNING nova.compute.manager [req-ff9e91d3-817a-45ff-8133-b32f465b157e req-f4d3c178-0b02-4277-b7af-5896d9fee5d0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received unexpected event network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c for instance with vm_state active and task_state resize_finish.
Feb 16 17:24:51 compute-0 nova_compute[186176]: 2026-02-16 17:24:51.130 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:51 compute-0 nova_compute[186176]: 2026-02-16 17:24:51.384 186180 DEBUG nova.compute.manager [req-0be0c689-8974-4033-af07-897e45ec9215 req-8e2afd5d-c37d-4d27-9b9b-1bd10011d219 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received event network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:24:51 compute-0 nova_compute[186176]: 2026-02-16 17:24:51.384 186180 DEBUG oslo_concurrency.lockutils [req-0be0c689-8974-4033-af07-897e45ec9215 req-8e2afd5d-c37d-4d27-9b9b-1bd10011d219 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:51 compute-0 nova_compute[186176]: 2026-02-16 17:24:51.385 186180 DEBUG oslo_concurrency.lockutils [req-0be0c689-8974-4033-af07-897e45ec9215 req-8e2afd5d-c37d-4d27-9b9b-1bd10011d219 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:51 compute-0 nova_compute[186176]: 2026-02-16 17:24:51.385 186180 DEBUG oslo_concurrency.lockutils [req-0be0c689-8974-4033-af07-897e45ec9215 req-8e2afd5d-c37d-4d27-9b9b-1bd10011d219 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:51 compute-0 nova_compute[186176]: 2026-02-16 17:24:51.385 186180 DEBUG nova.compute.manager [req-0be0c689-8974-4033-af07-897e45ec9215 req-8e2afd5d-c37d-4d27-9b9b-1bd10011d219 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] No waiting events found dispatching network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:24:51 compute-0 nova_compute[186176]: 2026-02-16 17:24:51.385 186180 WARNING nova.compute.manager [req-0be0c689-8974-4033-af07-897e45ec9215 req-8e2afd5d-c37d-4d27-9b9b-1bd10011d219 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Received unexpected event network-vif-plugged-45f5a5dc-c3f7-400d-80fb-aac0de24ed2c for instance with vm_state resized and task_state None.
Feb 16 17:24:52 compute-0 podman[207197]: 2026-02-16 17:24:52.11428359 +0000 UTC m=+0.079914287 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.7, architecture=x86_64, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1770267347, config_id=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 17:24:52 compute-0 nova_compute[186176]: 2026-02-16 17:24:52.924 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:53 compute-0 nova_compute[186176]: 2026-02-16 17:24:53.290 186180 DEBUG oslo_concurrency.lockutils [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "970cfdb5-b102-447c-9730-04f44a0117c2" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:53 compute-0 nova_compute[186176]: 2026-02-16 17:24:53.291 186180 DEBUG oslo_concurrency.lockutils [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:53 compute-0 nova_compute[186176]: 2026-02-16 17:24:53.291 186180 DEBUG nova.compute.manager [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Feb 16 17:24:54 compute-0 nova_compute[186176]: 2026-02-16 17:24:54.161 186180 DEBUG neutronclient.v2_0.client [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 45f5a5dc-c3f7-400d-80fb-aac0de24ed2c for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Feb 16 17:24:54 compute-0 nova_compute[186176]: 2026-02-16 17:24:54.163 186180 DEBUG oslo_concurrency.lockutils [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:24:54 compute-0 nova_compute[186176]: 2026-02-16 17:24:54.163 186180 DEBUG oslo_concurrency.lockutils [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:24:54 compute-0 nova_compute[186176]: 2026-02-16 17:24:54.163 186180 DEBUG nova.network.neutron [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:24:54 compute-0 nova_compute[186176]: 2026-02-16 17:24:54.164 186180 DEBUG nova.objects.instance [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'info_cache' on Instance uuid 970cfdb5-b102-447c-9730-04f44a0117c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:24:56 compute-0 podman[207219]: 2026-02-16 17:24:56.106641047 +0000 UTC m=+0.072027196 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.173 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.392 186180 DEBUG nova.network.neutron [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Updating instance_info_cache with network_info: [{"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.414 186180 DEBUG oslo_concurrency.lockutils [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-970cfdb5-b102-447c-9730-04f44a0117c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.414 186180 DEBUG nova.objects.instance [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid 970cfdb5-b102-447c-9730-04f44a0117c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.441 186180 DEBUG nova.virt.libvirt.host [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.442 186180 INFO nova.virt.libvirt.host [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] UEFI support detected
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.445 186180 DEBUG nova.virt.libvirt.vif [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:23:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-168706571',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-168706571',id=1,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:24:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-ncq3jj3z',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:24:49Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=970cfdb5-b102-447c-9730-04f44a0117c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.445 186180 DEBUG nova.network.os_vif_util [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "address": "fa:16:3e:4a:77:a3", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45f5a5dc-c3", "ovs_interfaceid": "45f5a5dc-c3f7-400d-80fb-aac0de24ed2c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.446 186180 DEBUG nova.network.os_vif_util [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4a:77:a3,bridge_name='br-int',has_traffic_filtering=True,id=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45f5a5dc-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.446 186180 DEBUG os_vif [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:77:a3,bridge_name='br-int',has_traffic_filtering=True,id=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45f5a5dc-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.447 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.448 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45f5a5dc-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.448 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.450 186180 INFO os_vif [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4a:77:a3,bridge_name='br-int',has_traffic_filtering=True,id=45f5a5dc-c3f7-400d-80fb-aac0de24ed2c,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45f5a5dc-c3')
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.451 186180 DEBUG oslo_concurrency.lockutils [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.451 186180 DEBUG oslo_concurrency.lockutils [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.523 186180 DEBUG nova.compute.provider_tree [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.540 186180 DEBUG nova.scheduler.client.report [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.581 186180 DEBUG oslo_concurrency.lockutils [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.699 186180 INFO nova.scheduler.client.report [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Deleted allocation for migration ad5225c4-2c71-406e-803b-68778cae39ad
Feb 16 17:24:56 compute-0 nova_compute[186176]: 2026-02-16 17:24:56.756 186180 DEBUG oslo_concurrency.lockutils [None req-109a7948-91b8-4938-b30f-14a2f0572d51 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "970cfdb5-b102-447c-9730-04f44a0117c2" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.465s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:24:57 compute-0 nova_compute[186176]: 2026-02-16 17:24:57.374 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771262682.3725653, 970cfdb5-b102-447c-9730-04f44a0117c2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:24:57 compute-0 nova_compute[186176]: 2026-02-16 17:24:57.374 186180 INFO nova.compute.manager [-] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] VM Stopped (Lifecycle Event)
Feb 16 17:24:57 compute-0 nova_compute[186176]: 2026-02-16 17:24:57.404 186180 DEBUG nova.compute.manager [None req-b8ba0568-8a09-4739-8fea-22c57d524529 - - - - - -] [instance: 970cfdb5-b102-447c-9730-04f44a0117c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:24:57 compute-0 nova_compute[186176]: 2026-02-16 17:24:57.927 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:24:59 compute-0 podman[195505]: time="2026-02-16T17:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:24:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:24:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2158 "" "Go-http-client/1.1"
Feb 16 17:25:01 compute-0 nova_compute[186176]: 2026-02-16 17:25:01.212 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:01 compute-0 openstack_network_exporter[198360]: ERROR   17:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:25:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:25:01 compute-0 openstack_network_exporter[198360]: ERROR   17:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:25:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:25:02 compute-0 nova_compute[186176]: 2026-02-16 17:25:02.929 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:03 compute-0 podman[207242]: 2026-02-16 17:25:03.09413519 +0000 UTC m=+0.057340438 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:25:03 compute-0 podman[207241]: 2026-02-16 17:25:03.120739598 +0000 UTC m=+0.083592748 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 17:25:06 compute-0 nova_compute[186176]: 2026-02-16 17:25:06.247 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:07 compute-0 nova_compute[186176]: 2026-02-16 17:25:07.931 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:11 compute-0 nova_compute[186176]: 2026-02-16 17:25:11.249 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:12 compute-0 nova_compute[186176]: 2026-02-16 17:25:12.935 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.375 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "49698b66-fe7c-4448-88b5-13f0281298da" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.376 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.394 186180 DEBUG nova.compute.manager [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.464 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.465 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.475 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.475 186180 INFO nova.compute.claims [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.571 186180 DEBUG nova.compute.provider_tree [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.585 186180 DEBUG nova.scheduler.client.report [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.609 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.610 186180 DEBUG nova.compute.manager [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.665 186180 DEBUG nova.compute.manager [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.665 186180 DEBUG nova.network.neutron [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.681 186180 INFO nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.702 186180 DEBUG nova.compute.manager [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.813 186180 DEBUG nova.compute.manager [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.815 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.816 186180 INFO nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Creating image(s)
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.817 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "/var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.818 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "/var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.819 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "/var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.844 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.888 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.890 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.891 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.917 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.979 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:13 compute-0 nova_compute[186176]: 2026-02-16 17:25:13.980 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.019 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.020 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.020 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.087 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.088 186180 DEBUG nova.virt.disk.api [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Checking if we can resize image /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.089 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.103 186180 DEBUG nova.policy [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '04e81d9e145a466bbabfe4fdaf9f09aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.164 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.165 186180 DEBUG nova.virt.disk.api [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Cannot resize image /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.165 186180 DEBUG nova.objects.instance [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lazy-loading 'migration_context' on Instance uuid 49698b66-fe7c-4448-88b5-13f0281298da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.184 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.184 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Ensure instance console log exists: /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.185 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.185 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.185 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:14 compute-0 nova_compute[186176]: 2026-02-16 17:25:14.568 186180 DEBUG nova.network.neutron [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Successfully created port: ed2a59c8-33d0-43c7-bb70-bee7dc282734 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:25:15 compute-0 nova_compute[186176]: 2026-02-16 17:25:15.370 186180 DEBUG nova.network.neutron [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Successfully updated port: ed2a59c8-33d0-43c7-bb70-bee7dc282734 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:25:15 compute-0 nova_compute[186176]: 2026-02-16 17:25:15.387 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "refresh_cache-49698b66-fe7c-4448-88b5-13f0281298da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:25:15 compute-0 nova_compute[186176]: 2026-02-16 17:25:15.387 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquired lock "refresh_cache-49698b66-fe7c-4448-88b5-13f0281298da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:25:15 compute-0 nova_compute[186176]: 2026-02-16 17:25:15.388 186180 DEBUG nova.network.neutron [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:25:15 compute-0 nova_compute[186176]: 2026-02-16 17:25:15.441 186180 DEBUG nova.compute.manager [req-0669f1bf-fb33-4aed-ad78-657ebcc3b876 req-d3552f94-a52b-44b6-826b-f354887e0929 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Received event network-changed-ed2a59c8-33d0-43c7-bb70-bee7dc282734 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:25:15 compute-0 nova_compute[186176]: 2026-02-16 17:25:15.442 186180 DEBUG nova.compute.manager [req-0669f1bf-fb33-4aed-ad78-657ebcc3b876 req-d3552f94-a52b-44b6-826b-f354887e0929 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Refreshing instance network info cache due to event network-changed-ed2a59c8-33d0-43c7-bb70-bee7dc282734. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:25:15 compute-0 nova_compute[186176]: 2026-02-16 17:25:15.442 186180 DEBUG oslo_concurrency.lockutils [req-0669f1bf-fb33-4aed-ad78-657ebcc3b876 req-d3552f94-a52b-44b6-826b-f354887e0929 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-49698b66-fe7c-4448-88b5-13f0281298da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:25:15 compute-0 nova_compute[186176]: 2026-02-16 17:25:15.521 186180 DEBUG nova.network.neutron [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.236 186180 DEBUG nova.network.neutron [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Updating instance_info_cache with network_info: [{"id": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "address": "fa:16:3e:c0:d1:25", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped2a59c8-33", "ovs_interfaceid": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.259 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Releasing lock "refresh_cache-49698b66-fe7c-4448-88b5-13f0281298da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.260 186180 DEBUG nova.compute.manager [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Instance network_info: |[{"id": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "address": "fa:16:3e:c0:d1:25", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped2a59c8-33", "ovs_interfaceid": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.261 186180 DEBUG oslo_concurrency.lockutils [req-0669f1bf-fb33-4aed-ad78-657ebcc3b876 req-d3552f94-a52b-44b6-826b-f354887e0929 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-49698b66-fe7c-4448-88b5-13f0281298da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.261 186180 DEBUG nova.network.neutron [req-0669f1bf-fb33-4aed-ad78-657ebcc3b876 req-d3552f94-a52b-44b6-826b-f354887e0929 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Refreshing network info cache for port ed2a59c8-33d0-43c7-bb70-bee7dc282734 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.266 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Start _get_guest_xml network_info=[{"id": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "address": "fa:16:3e:c0:d1:25", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped2a59c8-33", "ovs_interfaceid": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.272 186180 WARNING nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.277 186180 DEBUG nova.virt.libvirt.host [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.278 186180 DEBUG nova.virt.libvirt.host [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.287 186180 DEBUG nova.virt.libvirt.host [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.288 186180 DEBUG nova.virt.libvirt.host [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.291 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.292 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.293 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.293 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.294 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.294 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.295 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.295 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.296 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.296 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.297 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.297 186180 DEBUG nova.virt.hardware [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.306 186180 DEBUG nova.virt.libvirt.vif [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:25:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1920538887',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1920538887',id=4,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-g4zymj4c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:25:13Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=49698b66-fe7c-4448-88b5-13f0281298da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "address": "fa:16:3e:c0:d1:25", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped2a59c8-33", "ovs_interfaceid": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.307 186180 DEBUG nova.network.os_vif_util [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converting VIF {"id": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "address": "fa:16:3e:c0:d1:25", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped2a59c8-33", "ovs_interfaceid": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.308 186180 DEBUG nova.network.os_vif_util [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:d1:25,bridge_name='br-int',has_traffic_filtering=True,id=ed2a59c8-33d0-43c7-bb70-bee7dc282734,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped2a59c8-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.311 186180 DEBUG nova.objects.instance [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lazy-loading 'pci_devices' on Instance uuid 49698b66-fe7c-4448-88b5-13f0281298da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.313 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.330 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:25:16 compute-0 nova_compute[186176]:   <uuid>49698b66-fe7c-4448-88b5-13f0281298da</uuid>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   <name>instance-00000004</name>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1920538887</nova:name>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:25:16</nova:creationTime>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:25:16 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:25:16 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:25:16 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:25:16 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:25:16 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:25:16 compute-0 nova_compute[186176]:         <nova:user uuid="04e81d9e145a466bbabfe4fdaf9f09aa">tempest-TestExecuteActionsViaActuator-900316824-project-member</nova:user>
Feb 16 17:25:16 compute-0 nova_compute[186176]:         <nova:project uuid="97a4c97daa7a495f91b4f65a132f7c0f">tempest-TestExecuteActionsViaActuator-900316824</nova:project>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:25:16 compute-0 nova_compute[186176]:         <nova:port uuid="ed2a59c8-33d0-43c7-bb70-bee7dc282734">
Feb 16 17:25:16 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <system>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <entry name="serial">49698b66-fe7c-4448-88b5-13f0281298da</entry>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <entry name="uuid">49698b66-fe7c-4448-88b5-13f0281298da</entry>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     </system>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   <os>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   </os>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   <features>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   </features>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk.config"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:c0:d1:25"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <target dev="taped2a59c8-33"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/console.log" append="off"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <video>
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     </video>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:25:16 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:25:16 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:25:16 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:25:16 compute-0 nova_compute[186176]: </domain>
Feb 16 17:25:16 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.331 186180 DEBUG nova.compute.manager [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Preparing to wait for external event network-vif-plugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.332 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "49698b66-fe7c-4448-88b5-13f0281298da-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.333 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.333 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.334 186180 DEBUG nova.virt.libvirt.vif [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:25:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1920538887',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1920538887',id=4,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-g4zymj4c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:25:13Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=49698b66-fe7c-4448-88b5-13f0281298da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "address": "fa:16:3e:c0:d1:25", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped2a59c8-33", "ovs_interfaceid": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.335 186180 DEBUG nova.network.os_vif_util [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converting VIF {"id": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "address": "fa:16:3e:c0:d1:25", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped2a59c8-33", "ovs_interfaceid": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.336 186180 DEBUG nova.network.os_vif_util [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:d1:25,bridge_name='br-int',has_traffic_filtering=True,id=ed2a59c8-33d0-43c7-bb70-bee7dc282734,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped2a59c8-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.336 186180 DEBUG os_vif [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:d1:25,bridge_name='br-int',has_traffic_filtering=True,id=ed2a59c8-33d0-43c7-bb70-bee7dc282734,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped2a59c8-33') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.337 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.338 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.338 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.342 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.343 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped2a59c8-33, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.343 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=taped2a59c8-33, col_values=(('external_ids', {'iface-id': 'ed2a59c8-33d0-43c7-bb70-bee7dc282734', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:d1:25', 'vm-uuid': '49698b66-fe7c-4448-88b5-13f0281298da'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.345 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:16 compute-0 NetworkManager[56463]: <info>  [1771262716.3471] manager: (taped2a59c8-33): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.348 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.352 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.353 186180 INFO os_vif [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:d1:25,bridge_name='br-int',has_traffic_filtering=True,id=ed2a59c8-33d0-43c7-bb70-bee7dc282734,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped2a59c8-33')
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.409 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.409 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.410 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] No VIF found with MAC fa:16:3e:c0:d1:25, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.410 186180 INFO nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Using config drive
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.762 186180 INFO nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Creating config drive at /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk.config
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.769 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcg2ihwno execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.897 186180 DEBUG oslo_concurrency.processutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpcg2ihwno" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:16 compute-0 kernel: taped2a59c8-33: entered promiscuous mode
Feb 16 17:25:16 compute-0 ovn_controller[96437]: 2026-02-16T17:25:16Z|00035|binding|INFO|Claiming lport ed2a59c8-33d0-43c7-bb70-bee7dc282734 for this chassis.
Feb 16 17:25:16 compute-0 ovn_controller[96437]: 2026-02-16T17:25:16Z|00036|binding|INFO|ed2a59c8-33d0-43c7-bb70-bee7dc282734: Claiming fa:16:3e:c0:d1:25 10.100.0.11
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.963 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:16 compute-0 NetworkManager[56463]: <info>  [1771262716.9660] manager: (taped2a59c8-33): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Feb 16 17:25:16 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:16.973 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:d1:25 10.100.0.11'], port_security=['fa:16:3e:c0:d1:25 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '49698b66-fe7c-4448-88b5-13f0281298da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a09b650a-b8da-4ec6-af84-f46bd29af7dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18cf134a-5b0b-4046-bb3d-fdfa0b081c31, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=ed2a59c8-33d0-43c7-bb70-bee7dc282734) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:25:16 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:16.975 105730 INFO neutron.agent.ovn.metadata.agent [-] Port ed2a59c8-33d0-43c7-bb70-bee7dc282734 in datapath 50b90e9d-0874-4370-ad17-1fff2c4cce15 bound to our chassis
Feb 16 17:25:16 compute-0 ovn_controller[96437]: 2026-02-16T17:25:16Z|00037|binding|INFO|Setting lport ed2a59c8-33d0-43c7-bb70-bee7dc282734 up in Southbound
Feb 16 17:25:16 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:16.976 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:25:16 compute-0 ovn_controller[96437]: 2026-02-16T17:25:16Z|00038|binding|INFO|Setting lport ed2a59c8-33d0-43c7-bb70-bee7dc282734 ovn-installed in OVS
Feb 16 17:25:16 compute-0 nova_compute[186176]: 2026-02-16 17:25:16.978 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:16 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:16.987 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[c884cdff-7e21-4c39-8008-8d58a7343655]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:16 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:16.989 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap50b90e9d-01 in ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:25:16 compute-0 systemd-machined[155631]: New machine qemu-2-instance-00000004.
Feb 16 17:25:16 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:16.991 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap50b90e9d-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:25:16 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:16.992 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[6016bdf4-e974-4f06-a496-57485c62d772]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:16 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:16.993 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[06f7203b-d969-4131-ad24-f4775afc2ae0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.004 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[e803b1b5-9953-4abb-9507-627eecc5d830]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.030 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[cda9ded3-aacc-42fd-881c-0e7e4a555179]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 systemd-udevd[207330]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:25:17 compute-0 NetworkManager[56463]: <info>  [1771262717.0471] device (taped2a59c8-33): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:25:17 compute-0 NetworkManager[56463]: <info>  [1771262717.0483] device (taped2a59c8-33): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.059 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[682cb561-ff85-4640-a26f-b1370af60f20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.065 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[42256f1e-5094-4924-beb8-03c3699afe19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 systemd-udevd[207336]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:25:17 compute-0 NetworkManager[56463]: <info>  [1771262717.0667] manager: (tap50b90e9d-00): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.092 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[73c3bced-2e58-4564-8cd1-ef0c9794f391]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.098 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[813cb4a1-4d19-40f3-a03e-f9fae35c6f5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 NetworkManager[56463]: <info>  [1771262717.1158] device (tap50b90e9d-00): carrier: link connected
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.117 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[ec60caeb-9a46-4316-86ac-bcc3e1d3a12c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.133 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[77c6c365-b940-40db-9bc0-ed2419b85866]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b90e9d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:d8:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431264, 'reachable_time': 25027, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207361, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.145 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[fa934468-cfd4-4592-82d1-b4d178e7a7e8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8c:d889'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431264, 'tstamp': 431264}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207362, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.156 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b18728c3-3bf9-4359-a4f6-12144b03701b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b90e9d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:d8:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431264, 'reachable_time': 25027, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 207363, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.175 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[276ee474-c71e-4786-8bf1-e90b5b0d68be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.209 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[84195eea-91cf-4a4c-b21d-260c2f2adccc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.211 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b90e9d-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.212 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.213 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50b90e9d-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:17 compute-0 kernel: tap50b90e9d-00: entered promiscuous mode
Feb 16 17:25:17 compute-0 NetworkManager[56463]: <info>  [1771262717.2164] manager: (tap50b90e9d-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.215 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.217 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.222 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap50b90e9d-00, col_values=(('external_ids', {'iface-id': '7e8ec4b7-6252-49aa-a342-59a2b0f3de95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.224 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:17 compute-0 ovn_controller[96437]: 2026-02-16T17:25:17Z|00039|binding|INFO|Releasing lport 7e8ec4b7-6252-49aa-a342-59a2b0f3de95 from this chassis (sb_readonly=0)
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.224 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.225 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/50b90e9d-0874-4370-ad17-1fff2c4cce15.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/50b90e9d-0874-4370-ad17-1fff2c4cce15.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.229 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[0d13f6bb-ef81-4b17-a37b-d53a996ddb69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.229 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.230 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/50b90e9d-0874-4370-ad17-1fff2c4cce15.pid.haproxy
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:25:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:17.231 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'env', 'PROCESS_TAG=haproxy-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/50b90e9d-0874-4370-ad17-1fff2c4cce15.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.380 186180 DEBUG nova.network.neutron [req-0669f1bf-fb33-4aed-ad78-657ebcc3b876 req-d3552f94-a52b-44b6-826b-f354887e0929 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Updated VIF entry in instance network info cache for port ed2a59c8-33d0-43c7-bb70-bee7dc282734. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.381 186180 DEBUG nova.network.neutron [req-0669f1bf-fb33-4aed-ad78-657ebcc3b876 req-d3552f94-a52b-44b6-826b-f354887e0929 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Updating instance_info_cache with network_info: [{"id": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "address": "fa:16:3e:c0:d1:25", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped2a59c8-33", "ovs_interfaceid": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.424 186180 DEBUG oslo_concurrency.lockutils [req-0669f1bf-fb33-4aed-ad78-657ebcc3b876 req-d3552f94-a52b-44b6-826b-f354887e0929 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-49698b66-fe7c-4448-88b5-13f0281298da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.435 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262717.4345284, 49698b66-fe7c-4448-88b5-13f0281298da => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.436 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] VM Started (Lifecycle Event)
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.475 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.480 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262717.435723, 49698b66-fe7c-4448-88b5-13f0281298da => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.480 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] VM Paused (Lifecycle Event)
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.519 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:25:17 compute-0 podman[207402]: 2026-02-16 17:25:17.609672286 +0000 UTC m=+0.073919163 container create d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:25:17 compute-0 systemd[1]: Started libpod-conmon-d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313.scope.
Feb 16 17:25:17 compute-0 podman[207402]: 2026-02-16 17:25:17.572391637 +0000 UTC m=+0.036638574 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.685 186180 DEBUG nova.compute.manager [req-3d9641b5-c454-471b-ae1e-abd7cc3f47db req-972c5332-7397-48e6-ac3b-c080baeafa0a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Received event network-vif-plugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.685 186180 DEBUG oslo_concurrency.lockutils [req-3d9641b5-c454-471b-ae1e-abd7cc3f47db req-972c5332-7397-48e6-ac3b-c080baeafa0a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "49698b66-fe7c-4448-88b5-13f0281298da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.686 186180 DEBUG oslo_concurrency.lockutils [req-3d9641b5-c454-471b-ae1e-abd7cc3f47db req-972c5332-7397-48e6-ac3b-c080baeafa0a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.686 186180 DEBUG oslo_concurrency.lockutils [req-3d9641b5-c454-471b-ae1e-abd7cc3f47db req-972c5332-7397-48e6-ac3b-c080baeafa0a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.686 186180 DEBUG nova.compute.manager [req-3d9641b5-c454-471b-ae1e-abd7cc3f47db req-972c5332-7397-48e6-ac3b-c080baeafa0a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Processing event network-vif-plugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.687 186180 DEBUG nova.compute.manager [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.690 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.692 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.696 186180 INFO nova.virt.libvirt.driver [-] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Instance spawned successfully.
Feb 16 17:25:17 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.696 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:25:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e53cb9178fa9571351b9fc56bc15852c57a57b23810b531f4258b50d611da39/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.713 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.713 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262717.6925592, 49698b66-fe7c-4448-88b5-13f0281298da => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.713 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] VM Resumed (Lifecycle Event)
Feb 16 17:25:17 compute-0 podman[207402]: 2026-02-16 17:25:17.720016814 +0000 UTC m=+0.184263741 container init d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.722 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.722 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.723 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.723 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.723 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.724 186180 DEBUG nova.virt.libvirt.driver [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:17 compute-0 podman[207402]: 2026-02-16 17:25:17.726274467 +0000 UTC m=+0.190521344 container start d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.729 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.737 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:25:17 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[207417]: [NOTICE]   (207421) : New worker (207423) forked
Feb 16 17:25:17 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[207417]: [NOTICE]   (207421) : Loading success.
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.766 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.785 186180 INFO nova.compute.manager [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Took 3.97 seconds to spawn the instance on the hypervisor.
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.786 186180 DEBUG nova.compute.manager [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.866 186180 INFO nova.compute.manager [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Took 4.43 seconds to build instance.
Feb 16 17:25:17 compute-0 nova_compute[186176]: 2026-02-16 17:25:17.882 186180 DEBUG oslo_concurrency.lockutils [None req-6e23b46f-e93f-41b4-8676-5e9b4f32a676 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.506s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:19 compute-0 nova_compute[186176]: 2026-02-16 17:25:19.662 186180 DEBUG nova.compute.manager [req-5a1a119e-f912-4959-87cf-510b5f724043 req-16308550-149b-4a94-b266-0a287aa54725 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Received event network-vif-plugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:25:19 compute-0 nova_compute[186176]: 2026-02-16 17:25:19.662 186180 DEBUG oslo_concurrency.lockutils [req-5a1a119e-f912-4959-87cf-510b5f724043 req-16308550-149b-4a94-b266-0a287aa54725 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "49698b66-fe7c-4448-88b5-13f0281298da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:19 compute-0 nova_compute[186176]: 2026-02-16 17:25:19.663 186180 DEBUG oslo_concurrency.lockutils [req-5a1a119e-f912-4959-87cf-510b5f724043 req-16308550-149b-4a94-b266-0a287aa54725 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:19 compute-0 nova_compute[186176]: 2026-02-16 17:25:19.663 186180 DEBUG oslo_concurrency.lockutils [req-5a1a119e-f912-4959-87cf-510b5f724043 req-16308550-149b-4a94-b266-0a287aa54725 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:19 compute-0 nova_compute[186176]: 2026-02-16 17:25:19.663 186180 DEBUG nova.compute.manager [req-5a1a119e-f912-4959-87cf-510b5f724043 req-16308550-149b-4a94-b266-0a287aa54725 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] No waiting events found dispatching network-vif-plugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:25:19 compute-0 nova_compute[186176]: 2026-02-16 17:25:19.663 186180 WARNING nova.compute.manager [req-5a1a119e-f912-4959-87cf-510b5f724043 req-16308550-149b-4a94-b266-0a287aa54725 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Received unexpected event network-vif-plugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 for instance with vm_state active and task_state None.
Feb 16 17:25:21 compute-0 nova_compute[186176]: 2026-02-16 17:25:21.338 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:21 compute-0 nova_compute[186176]: 2026-02-16 17:25:21.344 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:23 compute-0 podman[207432]: 2026-02-16 17:25:23.108134835 +0000 UTC m=+0.078333359 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter)
Feb 16 17:25:26 compute-0 nova_compute[186176]: 2026-02-16 17:25:26.343 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:27 compute-0 podman[207455]: 2026-02-16 17:25:27.106233046 +0000 UTC m=+0.064721798 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 17:25:28 compute-0 ovn_controller[96437]: 2026-02-16T17:25:28Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c0:d1:25 10.100.0.11
Feb 16 17:25:28 compute-0 ovn_controller[96437]: 2026-02-16T17:25:28Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c0:d1:25 10.100.0.11
Feb 16 17:25:29 compute-0 podman[195505]: time="2026-02-16T17:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:25:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:25:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2630 "" "Go-http-client/1.1"
Feb 16 17:25:31 compute-0 nova_compute[186176]: 2026-02-16 17:25:31.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:25:31 compute-0 nova_compute[186176]: 2026-02-16 17:25:31.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:25:31 compute-0 nova_compute[186176]: 2026-02-16 17:25:31.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:25:31 compute-0 nova_compute[186176]: 2026-02-16 17:25:31.346 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:31 compute-0 openstack_network_exporter[198360]: ERROR   17:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:25:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:25:31 compute-0 openstack_network_exporter[198360]: ERROR   17:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:25:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:25:31 compute-0 nova_compute[186176]: 2026-02-16 17:25:31.561 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-49698b66-fe7c-4448-88b5-13f0281298da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:25:31 compute-0 nova_compute[186176]: 2026-02-16 17:25:31.561 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-49698b66-fe7c-4448-88b5-13f0281298da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:25:31 compute-0 nova_compute[186176]: 2026-02-16 17:25:31.561 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:25:31 compute-0 nova_compute[186176]: 2026-02-16 17:25:31.561 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 49698b66-fe7c-4448-88b5-13f0281298da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:25:33 compute-0 nova_compute[186176]: 2026-02-16 17:25:33.282 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Updating instance_info_cache with network_info: [{"id": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "address": "fa:16:3e:c0:d1:25", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped2a59c8-33", "ovs_interfaceid": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:25:33 compute-0 nova_compute[186176]: 2026-02-16 17:25:33.300 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-49698b66-fe7c-4448-88b5-13f0281298da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:25:33 compute-0 nova_compute[186176]: 2026-02-16 17:25:33.301 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:25:33 compute-0 nova_compute[186176]: 2026-02-16 17:25:33.301 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:25:33 compute-0 nova_compute[186176]: 2026-02-16 17:25:33.302 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:25:33 compute-0 nova_compute[186176]: 2026-02-16 17:25:33.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:25:34 compute-0 podman[207497]: 2026-02-16 17:25:34.078806443 +0000 UTC m=+0.049262621 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:25:34 compute-0 podman[207496]: 2026-02-16 17:25:34.183926595 +0000 UTC m=+0.155242994 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.337 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.338 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.338 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.339 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.403 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.478 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.479 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.554 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.690 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.692 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5696MB free_disk=73.19894790649414GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.693 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.693 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.762 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance 49698b66-fe7c-4448-88b5-13f0281298da actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.763 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.763 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.808 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.824 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.847 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:25:34 compute-0 nova_compute[186176]: 2026-02-16 17:25:34.847 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:35 compute-0 nova_compute[186176]: 2026-02-16 17:25:35.843 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:25:35 compute-0 nova_compute[186176]: 2026-02-16 17:25:35.844 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.349 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.357 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.357 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5009 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.358 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.359 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.359 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.377 186180 DEBUG nova.compute.manager [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.381 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.382 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.442 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.442 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.449 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.449 186180 INFO nova.compute.claims [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.572 186180 DEBUG nova.compute.provider_tree [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.586 186180 DEBUG nova.scheduler.client.report [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.610 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.611 186180 DEBUG nova.compute.manager [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.651 186180 DEBUG nova.compute.manager [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.651 186180 DEBUG nova.network.neutron [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.680 186180 INFO nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.694 186180 DEBUG nova.compute.manager [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.780 186180 DEBUG nova.compute.manager [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.781 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.782 186180 INFO nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Creating image(s)
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.782 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "/var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.783 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "/var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.783 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "/var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.800 186180 DEBUG nova.policy [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '04e81d9e145a466bbabfe4fdaf9f09aa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.803 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.867 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.869 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.869 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.880 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.941 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.942 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.973 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.974 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:36 compute-0 nova_compute[186176]: 2026-02-16 17:25:36.974 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.019 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.020 186180 DEBUG nova.virt.disk.api [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Checking if we can resize image /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.021 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.065 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.066 186180 DEBUG nova.virt.disk.api [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Cannot resize image /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.067 186180 DEBUG nova.objects.instance [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lazy-loading 'migration_context' on Instance uuid 7a804a24-fd5e-4882-be31-38bfdfa8c2c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.083 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.084 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Ensure instance console log exists: /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.084 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.085 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.085 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.281 186180 DEBUG nova.network.neutron [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Successfully created port: d83659d0-e89e-455a-91b4-462622d79d07 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.936 186180 DEBUG nova.network.neutron [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Successfully updated port: d83659d0-e89e-455a-91b4-462622d79d07 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.963 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "refresh_cache-7a804a24-fd5e-4882-be31-38bfdfa8c2c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.963 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquired lock "refresh_cache-7a804a24-fd5e-4882-be31-38bfdfa8c2c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:25:37 compute-0 nova_compute[186176]: 2026-02-16 17:25:37.964 186180 DEBUG nova.network.neutron [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:25:38 compute-0 nova_compute[186176]: 2026-02-16 17:25:38.034 186180 DEBUG nova.compute.manager [req-47ee62ac-a22c-400f-a89e-0e368db67991 req-35790122-5857-4442-838a-0e3decdc4371 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Received event network-changed-d83659d0-e89e-455a-91b4-462622d79d07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:25:38 compute-0 nova_compute[186176]: 2026-02-16 17:25:38.035 186180 DEBUG nova.compute.manager [req-47ee62ac-a22c-400f-a89e-0e368db67991 req-35790122-5857-4442-838a-0e3decdc4371 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Refreshing instance network info cache due to event network-changed-d83659d0-e89e-455a-91b4-462622d79d07. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:25:38 compute-0 nova_compute[186176]: 2026-02-16 17:25:38.035 186180 DEBUG oslo_concurrency.lockutils [req-47ee62ac-a22c-400f-a89e-0e368db67991 req-35790122-5857-4442-838a-0e3decdc4371 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-7a804a24-fd5e-4882-be31-38bfdfa8c2c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:25:38 compute-0 nova_compute[186176]: 2026-02-16 17:25:38.137 186180 DEBUG nova.network.neutron [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:25:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:38.152 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:38.153 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:38.154 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:38 compute-0 nova_compute[186176]: 2026-02-16 17:25:38.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.118 186180 DEBUG nova.network.neutron [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Updating instance_info_cache with network_info: [{"id": "d83659d0-e89e-455a-91b4-462622d79d07", "address": "fa:16:3e:35:08:e1", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83659d0-e8", "ovs_interfaceid": "d83659d0-e89e-455a-91b4-462622d79d07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.142 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Releasing lock "refresh_cache-7a804a24-fd5e-4882-be31-38bfdfa8c2c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.143 186180 DEBUG nova.compute.manager [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Instance network_info: |[{"id": "d83659d0-e89e-455a-91b4-462622d79d07", "address": "fa:16:3e:35:08:e1", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83659d0-e8", "ovs_interfaceid": "d83659d0-e89e-455a-91b4-462622d79d07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.144 186180 DEBUG oslo_concurrency.lockutils [req-47ee62ac-a22c-400f-a89e-0e368db67991 req-35790122-5857-4442-838a-0e3decdc4371 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-7a804a24-fd5e-4882-be31-38bfdfa8c2c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.145 186180 DEBUG nova.network.neutron [req-47ee62ac-a22c-400f-a89e-0e368db67991 req-35790122-5857-4442-838a-0e3decdc4371 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Refreshing network info cache for port d83659d0-e89e-455a-91b4-462622d79d07 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.149 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Start _get_guest_xml network_info=[{"id": "d83659d0-e89e-455a-91b4-462622d79d07", "address": "fa:16:3e:35:08:e1", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83659d0-e8", "ovs_interfaceid": "d83659d0-e89e-455a-91b4-462622d79d07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.155 186180 WARNING nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.160 186180 DEBUG nova.virt.libvirt.host [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.161 186180 DEBUG nova.virt.libvirt.host [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.168 186180 DEBUG nova.virt.libvirt.host [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.169 186180 DEBUG nova.virt.libvirt.host [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.170 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.171 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.171 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.171 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.172 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.172 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.172 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.172 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.173 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.173 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.173 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.174 186180 DEBUG nova.virt.hardware [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.177 186180 DEBUG nova.virt.libvirt.vif [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:25:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-179170323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-179170323',id=6,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-3no5x7ds',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project
-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:25:36Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=7a804a24-fd5e-4882-be31-38bfdfa8c2c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d83659d0-e89e-455a-91b4-462622d79d07", "address": "fa:16:3e:35:08:e1", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83659d0-e8", "ovs_interfaceid": "d83659d0-e89e-455a-91b4-462622d79d07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.178 186180 DEBUG nova.network.os_vif_util [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converting VIF {"id": "d83659d0-e89e-455a-91b4-462622d79d07", "address": "fa:16:3e:35:08:e1", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83659d0-e8", "ovs_interfaceid": "d83659d0-e89e-455a-91b4-462622d79d07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.179 186180 DEBUG nova.network.os_vif_util [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:08:e1,bridge_name='br-int',has_traffic_filtering=True,id=d83659d0-e89e-455a-91b4-462622d79d07,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83659d0-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.180 186180 DEBUG nova.objects.instance [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lazy-loading 'pci_devices' on Instance uuid 7a804a24-fd5e-4882-be31-38bfdfa8c2c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.193 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:25:41 compute-0 nova_compute[186176]:   <uuid>7a804a24-fd5e-4882-be31-38bfdfa8c2c3</uuid>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   <name>instance-00000006</name>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-179170323</nova:name>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:25:41</nova:creationTime>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:25:41 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:25:41 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:25:41 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:25:41 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:25:41 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:25:41 compute-0 nova_compute[186176]:         <nova:user uuid="04e81d9e145a466bbabfe4fdaf9f09aa">tempest-TestExecuteActionsViaActuator-900316824-project-member</nova:user>
Feb 16 17:25:41 compute-0 nova_compute[186176]:         <nova:project uuid="97a4c97daa7a495f91b4f65a132f7c0f">tempest-TestExecuteActionsViaActuator-900316824</nova:project>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:25:41 compute-0 nova_compute[186176]:         <nova:port uuid="d83659d0-e89e-455a-91b4-462622d79d07">
Feb 16 17:25:41 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <system>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <entry name="serial">7a804a24-fd5e-4882-be31-38bfdfa8c2c3</entry>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <entry name="uuid">7a804a24-fd5e-4882-be31-38bfdfa8c2c3</entry>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     </system>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   <os>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   </os>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   <features>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   </features>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk.config"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:35:08:e1"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <target dev="tapd83659d0-e8"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/console.log" append="off"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <video>
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     </video>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:25:41 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:25:41 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:25:41 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:25:41 compute-0 nova_compute[186176]: </domain>
Feb 16 17:25:41 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.194 186180 DEBUG nova.compute.manager [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Preparing to wait for external event network-vif-plugged-d83659d0-e89e-455a-91b4-462622d79d07 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.194 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.194 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.194 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.195 186180 DEBUG nova.virt.libvirt.vif [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:25:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-179170323',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-179170323',id=6,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-3no5x7ds',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-9003168
24-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:25:36Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=7a804a24-fd5e-4882-be31-38bfdfa8c2c3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d83659d0-e89e-455a-91b4-462622d79d07", "address": "fa:16:3e:35:08:e1", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83659d0-e8", "ovs_interfaceid": "d83659d0-e89e-455a-91b4-462622d79d07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.195 186180 DEBUG nova.network.os_vif_util [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converting VIF {"id": "d83659d0-e89e-455a-91b4-462622d79d07", "address": "fa:16:3e:35:08:e1", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83659d0-e8", "ovs_interfaceid": "d83659d0-e89e-455a-91b4-462622d79d07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.196 186180 DEBUG nova.network.os_vif_util [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:08:e1,bridge_name='br-int',has_traffic_filtering=True,id=d83659d0-e89e-455a-91b4-462622d79d07,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83659d0-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.196 186180 DEBUG os_vif [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:08:e1,bridge_name='br-int',has_traffic_filtering=True,id=d83659d0-e89e-455a-91b4-462622d79d07,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83659d0-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.197 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.197 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.198 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.201 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.201 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd83659d0-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.202 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd83659d0-e8, col_values=(('external_ids', {'iface-id': 'd83659d0-e89e-455a-91b4-462622d79d07', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:08:e1', 'vm-uuid': '7a804a24-fd5e-4882-be31-38bfdfa8c2c3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.203 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:41 compute-0 NetworkManager[56463]: <info>  [1771262741.2049] manager: (tapd83659d0-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.206 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.211 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.212 186180 INFO os_vif [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:08:e1,bridge_name='br-int',has_traffic_filtering=True,id=d83659d0-e89e-455a-91b4-462622d79d07,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83659d0-e8')
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.260 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.261 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.261 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] No VIF found with MAC fa:16:3e:35:08:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.261 186180 INFO nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Using config drive
Feb 16 17:25:41 compute-0 nova_compute[186176]: 2026-02-16 17:25:41.383 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.097 186180 INFO nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Creating config drive at /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk.config
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.104 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpwk2kx9fp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.231 186180 DEBUG oslo_concurrency.processutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpwk2kx9fp" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:25:42 compute-0 kernel: tapd83659d0-e8: entered promiscuous mode
Feb 16 17:25:42 compute-0 NetworkManager[56463]: <info>  [1771262742.3025] manager: (tapd83659d0-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Feb 16 17:25:42 compute-0 ovn_controller[96437]: 2026-02-16T17:25:42Z|00040|binding|INFO|Claiming lport d83659d0-e89e-455a-91b4-462622d79d07 for this chassis.
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.303 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:42 compute-0 ovn_controller[96437]: 2026-02-16T17:25:42Z|00041|binding|INFO|d83659d0-e89e-455a-91b4-462622d79d07: Claiming fa:16:3e:35:08:e1 10.100.0.6
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.313 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.312 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:08:e1 10.100.0.6'], port_security=['fa:16:3e:35:08:e1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7a804a24-fd5e-4882-be31-38bfdfa8c2c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a09b650a-b8da-4ec6-af84-f46bd29af7dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18cf134a-5b0b-4046-bb3d-fdfa0b081c31, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=d83659d0-e89e-455a-91b4-462622d79d07) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:25:42 compute-0 ovn_controller[96437]: 2026-02-16T17:25:42Z|00042|binding|INFO|Setting lport d83659d0-e89e-455a-91b4-462622d79d07 ovn-installed in OVS
Feb 16 17:25:42 compute-0 ovn_controller[96437]: 2026-02-16T17:25:42Z|00043|binding|INFO|Setting lport d83659d0-e89e-455a-91b4-462622d79d07 up in Southbound
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.314 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.315 105730 INFO neutron.agent.ovn.metadata.agent [-] Port d83659d0-e89e-455a-91b4-462622d79d07 in datapath 50b90e9d-0874-4370-ad17-1fff2c4cce15 bound to our chassis
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.317 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:25:42 compute-0 systemd-udevd[207585]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.334 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a36f22-7da4-4e9b-8bca-2eb077afd934]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:42 compute-0 systemd-machined[155631]: New machine qemu-3-instance-00000006.
Feb 16 17:25:42 compute-0 NetworkManager[56463]: <info>  [1771262742.3454] device (tapd83659d0-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:25:42 compute-0 NetworkManager[56463]: <info>  [1771262742.3462] device (tapd83659d0-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:25:42 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.361 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[5e393d32-6725-47fa-b50d-640488ef2dff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.366 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[389b28e7-3004-424f-be98-a7fce461f4b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.389 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[42959971-a9bf-4eb3-948f-f4c2d6bc17bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.402 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f61dd52b-4779-4b5c-a409-109f119a84a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b90e9d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:d8:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 616, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431264, 'reachable_time': 25027, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207598, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.413 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[03740d09-8834-4034-91ba-56d19c005b29]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431271, 'tstamp': 431271}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207600, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431273, 'tstamp': 431273}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207600, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.416 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b90e9d-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.418 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.420 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.420 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50b90e9d-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.420 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.421 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap50b90e9d-00, col_values=(('external_ids', {'iface-id': '7e8ec4b7-6252-49aa-a342-59a2b0f3de95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:42.421 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.956 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262742.9555247, 7a804a24-fd5e-4882-be31-38bfdfa8c2c3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.956 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] VM Started (Lifecycle Event)
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.991 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.996 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262742.9558372, 7a804a24-fd5e-4882-be31-38bfdfa8c2c3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:25:42 compute-0 nova_compute[186176]: 2026-02-16 17:25:42.996 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] VM Paused (Lifecycle Event)
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.019 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.022 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.046 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.221 186180 DEBUG nova.compute.manager [req-6818c224-9a79-4561-8c19-fa253d0ff24b req-2c127c7f-e4a8-4022-8d91-642214057f39 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Received event network-vif-plugged-d83659d0-e89e-455a-91b4-462622d79d07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.222 186180 DEBUG oslo_concurrency.lockutils [req-6818c224-9a79-4561-8c19-fa253d0ff24b req-2c127c7f-e4a8-4022-8d91-642214057f39 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.222 186180 DEBUG oslo_concurrency.lockutils [req-6818c224-9a79-4561-8c19-fa253d0ff24b req-2c127c7f-e4a8-4022-8d91-642214057f39 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.223 186180 DEBUG oslo_concurrency.lockutils [req-6818c224-9a79-4561-8c19-fa253d0ff24b req-2c127c7f-e4a8-4022-8d91-642214057f39 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.223 186180 DEBUG nova.compute.manager [req-6818c224-9a79-4561-8c19-fa253d0ff24b req-2c127c7f-e4a8-4022-8d91-642214057f39 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Processing event network-vif-plugged-d83659d0-e89e-455a-91b4-462622d79d07 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.224 186180 DEBUG nova.compute.manager [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.229 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262743.2295828, 7a804a24-fd5e-4882-be31-38bfdfa8c2c3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.230 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] VM Resumed (Lifecycle Event)
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.234 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.238 186180 INFO nova.virt.libvirt.driver [-] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Instance spawned successfully.
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.239 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.270 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.273 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.307 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.311 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.311 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.312 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.312 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.312 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.312 186180 DEBUG nova.virt.libvirt.driver [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:25:43 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:43.351 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:25:43 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:43.352 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.371 186180 DEBUG nova.network.neutron [req-47ee62ac-a22c-400f-a89e-0e368db67991 req-35790122-5857-4442-838a-0e3decdc4371 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Updated VIF entry in instance network info cache for port d83659d0-e89e-455a-91b4-462622d79d07. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.372 186180 DEBUG nova.network.neutron [req-47ee62ac-a22c-400f-a89e-0e368db67991 req-35790122-5857-4442-838a-0e3decdc4371 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Updating instance_info_cache with network_info: [{"id": "d83659d0-e89e-455a-91b4-462622d79d07", "address": "fa:16:3e:35:08:e1", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83659d0-e8", "ovs_interfaceid": "d83659d0-e89e-455a-91b4-462622d79d07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.389 186180 INFO nova.compute.manager [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Took 6.61 seconds to spawn the instance on the hypervisor.
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.390 186180 DEBUG nova.compute.manager [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.390 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.399 186180 DEBUG oslo_concurrency.lockutils [req-47ee62ac-a22c-400f-a89e-0e368db67991 req-35790122-5857-4442-838a-0e3decdc4371 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-7a804a24-fd5e-4882-be31-38bfdfa8c2c3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.439 186180 INFO nova.compute.manager [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Took 7.01 seconds to build instance.
Feb 16 17:25:43 compute-0 nova_compute[186176]: 2026-02-16 17:25:43.452 186180 DEBUG oslo_concurrency.lockutils [None req-81962588-5f0d-442a-9b9f-56c6c016b25d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:45 compute-0 nova_compute[186176]: 2026-02-16 17:25:45.439 186180 DEBUG nova.compute.manager [req-fbe99243-f4d4-4639-874f-549a552414d3 req-b19989fe-a03a-4252-a2d6-5980ba4df515 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Received event network-vif-plugged-d83659d0-e89e-455a-91b4-462622d79d07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:25:45 compute-0 nova_compute[186176]: 2026-02-16 17:25:45.440 186180 DEBUG oslo_concurrency.lockutils [req-fbe99243-f4d4-4639-874f-549a552414d3 req-b19989fe-a03a-4252-a2d6-5980ba4df515 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:45 compute-0 nova_compute[186176]: 2026-02-16 17:25:45.440 186180 DEBUG oslo_concurrency.lockutils [req-fbe99243-f4d4-4639-874f-549a552414d3 req-b19989fe-a03a-4252-a2d6-5980ba4df515 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:45 compute-0 nova_compute[186176]: 2026-02-16 17:25:45.440 186180 DEBUG oslo_concurrency.lockutils [req-fbe99243-f4d4-4639-874f-549a552414d3 req-b19989fe-a03a-4252-a2d6-5980ba4df515 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:45 compute-0 nova_compute[186176]: 2026-02-16 17:25:45.441 186180 DEBUG nova.compute.manager [req-fbe99243-f4d4-4639-874f-549a552414d3 req-b19989fe-a03a-4252-a2d6-5980ba4df515 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] No waiting events found dispatching network-vif-plugged-d83659d0-e89e-455a-91b4-462622d79d07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:25:45 compute-0 nova_compute[186176]: 2026-02-16 17:25:45.441 186180 WARNING nova.compute.manager [req-fbe99243-f4d4-4639-874f-549a552414d3 req-b19989fe-a03a-4252-a2d6-5980ba4df515 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Received unexpected event network-vif-plugged-d83659d0-e89e-455a-91b4-462622d79d07 for instance with vm_state active and task_state None.
Feb 16 17:25:46 compute-0 nova_compute[186176]: 2026-02-16 17:25:46.205 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:46 compute-0 nova_compute[186176]: 2026-02-16 17:25:46.386 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:51 compute-0 nova_compute[186176]: 2026-02-16 17:25:51.209 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:51 compute-0 nova_compute[186176]: 2026-02-16 17:25:51.388 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:53 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:25:53.355 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:25:54 compute-0 podman[207622]: 2026-02-16 17:25:54.107320503 +0000 UTC m=+0.068754967 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, release=1770267347, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, version=9.7, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 16 17:25:54 compute-0 ovn_controller[96437]: 2026-02-16T17:25:54Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:08:e1 10.100.0.6
Feb 16 17:25:54 compute-0 ovn_controller[96437]: 2026-02-16T17:25:54Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:08:e1 10.100.0.6
Feb 16 17:25:56 compute-0 nova_compute[186176]: 2026-02-16 17:25:56.212 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:56 compute-0 nova_compute[186176]: 2026-02-16 17:25:56.390 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:25:57 compute-0 nova_compute[186176]: 2026-02-16 17:25:57.705 186180 DEBUG nova.virt.libvirt.driver [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Creating tmpfile /var/lib/nova/instances/tmpdtto8_r6 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 17:25:57 compute-0 nova_compute[186176]: 2026-02-16 17:25:57.843 186180 DEBUG nova.compute.manager [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdtto8_r6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 17:25:57 compute-0 nova_compute[186176]: 2026-02-16 17:25:57.877 186180 DEBUG nova.compute.manager [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Feb 16 17:25:57 compute-0 nova_compute[186176]: 2026-02-16 17:25:57.954 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:25:57 compute-0 nova_compute[186176]: 2026-02-16 17:25:57.955 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:25:57 compute-0 nova_compute[186176]: 2026-02-16 17:25:57.985 186180 DEBUG nova.objects.instance [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0d215b2a-91a9-4d0b-a04e-1355b877179d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.009 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.010 186180 INFO nova.compute.claims [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.010 186180 DEBUG nova.objects.instance [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'resources' on Instance uuid 0d215b2a-91a9-4d0b-a04e-1355b877179d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.042 186180 DEBUG nova.objects.instance [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'numa_topology' on Instance uuid 0d215b2a-91a9-4d0b-a04e-1355b877179d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.062 186180 DEBUG nova.objects.instance [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0d215b2a-91a9-4d0b-a04e-1355b877179d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:25:58 compute-0 podman[207644]: 2026-02-16 17:25:58.086708948 +0000 UTC m=+0.055206316 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.109 186180 INFO nova.compute.resource_tracker [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Updating resource usage from migration 8fdb585f-1482-43fc-8ed2-b4ffd4ce38c9
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.110 186180 DEBUG nova.compute.resource_tracker [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Starting to track incoming migration 8fdb585f-1482-43fc-8ed2-b4ffd4ce38c9 with flavor 75ce9d90-876f-4652-a61c-f74d306b6692 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.245 186180 DEBUG nova.compute.provider_tree [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.267 186180 DEBUG nova.scheduler.client.report [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.297 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.342s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.298 186180 INFO nova.compute.manager [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Migrating
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.751 186180 DEBUG nova.compute.manager [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdtto8_r6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='158c51e6-71fc-497d-9677-0db04ae83881',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.813 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-158c51e6-71fc-497d-9677-0db04ae83881" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.813 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-158c51e6-71fc-497d-9677-0db04ae83881" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:25:58 compute-0 nova_compute[186176]: 2026-02-16 17:25:58.814 186180 DEBUG nova.network.neutron [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:25:59 compute-0 podman[195505]: time="2026-02-16T17:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:25:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:25:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2633 "" "Go-http-client/1.1"
Feb 16 17:25:59 compute-0 sshd-session[207664]: Accepted publickey for nova from 192.168.122.101 port 58036 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:25:59 compute-0 systemd-logind[821]: New session 27 of user nova.
Feb 16 17:25:59 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 17:25:59 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 17:25:59 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 17:25:59 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 17:25:59 compute-0 systemd[207668]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:26:00 compute-0 systemd[207668]: Queued start job for default target Main User Target.
Feb 16 17:26:00 compute-0 systemd[207668]: Created slice User Application Slice.
Feb 16 17:26:00 compute-0 systemd[207668]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:26:00 compute-0 systemd[207668]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 17:26:00 compute-0 systemd[207668]: Reached target Paths.
Feb 16 17:26:00 compute-0 systemd[207668]: Reached target Timers.
Feb 16 17:26:00 compute-0 systemd[207668]: Starting D-Bus User Message Bus Socket...
Feb 16 17:26:00 compute-0 systemd[207668]: Starting Create User's Volatile Files and Directories...
Feb 16 17:26:00 compute-0 systemd[207668]: Finished Create User's Volatile Files and Directories.
Feb 16 17:26:00 compute-0 systemd[207668]: Listening on D-Bus User Message Bus Socket.
Feb 16 17:26:00 compute-0 systemd[207668]: Reached target Sockets.
Feb 16 17:26:00 compute-0 systemd[207668]: Reached target Basic System.
Feb 16 17:26:00 compute-0 systemd[207668]: Reached target Main User Target.
Feb 16 17:26:00 compute-0 systemd[207668]: Startup finished in 116ms.
Feb 16 17:26:00 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 17:26:00 compute-0 systemd[1]: Started Session 27 of User nova.
Feb 16 17:26:00 compute-0 sshd-session[207664]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:26:00 compute-0 sshd-session[207684]: Received disconnect from 192.168.122.101 port 58036:11: disconnected by user
Feb 16 17:26:00 compute-0 sshd-session[207684]: Disconnected from user nova 192.168.122.101 port 58036
Feb 16 17:26:00 compute-0 sshd-session[207664]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:26:00 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Feb 16 17:26:00 compute-0 systemd-logind[821]: Session 27 logged out. Waiting for processes to exit.
Feb 16 17:26:00 compute-0 systemd-logind[821]: Removed session 27.
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.164 186180 DEBUG nova.network.neutron [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Updating instance_info_cache with network_info: [{"id": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "address": "fa:16:3e:f2:13:78", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41b5d6ce-d6", "ovs_interfaceid": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.193 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-158c51e6-71fc-497d-9677-0db04ae83881" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.197 186180 DEBUG nova.virt.libvirt.driver [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdtto8_r6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='158c51e6-71fc-497d-9677-0db04ae83881',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.198 186180 DEBUG nova.virt.libvirt.driver [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Creating instance directory: /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.199 186180 DEBUG nova.virt.libvirt.driver [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Creating disk.info with the contents: {'/var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk': 'qcow2', '/var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.200 186180 DEBUG nova.virt.libvirt.driver [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.201 186180 DEBUG nova.objects.instance [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 158c51e6-71fc-497d-9677-0db04ae83881 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.242 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:26:00 compute-0 sshd-session[207686]: Accepted publickey for nova from 192.168.122.101 port 58038 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:26:00 compute-0 systemd-logind[821]: New session 29 of user nova.
Feb 16 17:26:00 compute-0 systemd[1]: Started Session 29 of User nova.
Feb 16 17:26:00 compute-0 sshd-session[207686]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.303 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.304 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.305 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.316 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:26:00 compute-0 sshd-session[207690]: Received disconnect from 192.168.122.101 port 58038:11: disconnected by user
Feb 16 17:26:00 compute-0 sshd-session[207690]: Disconnected from user nova 192.168.122.101 port 58038
Feb 16 17:26:00 compute-0 sshd-session[207686]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:26:00 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Feb 16 17:26:00 compute-0 systemd-logind[821]: Session 29 logged out. Waiting for processes to exit.
Feb 16 17:26:00 compute-0 systemd-logind[821]: Removed session 29.
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.363 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.364 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.397 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.399 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.399 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.466 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.468 186180 DEBUG nova.virt.disk.api [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Checking if we can resize image /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.468 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.527 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.529 186180 DEBUG nova.virt.disk.api [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Cannot resize image /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.529 186180 DEBUG nova.objects.instance [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid 158c51e6-71fc-497d-9677-0db04ae83881 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.545 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.570 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk.config 485376" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.574 186180 DEBUG nova.virt.libvirt.volume.remotefs [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk.config to /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.574 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk.config /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.957 186180 DEBUG oslo_concurrency.processutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881/disk.config /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881" returned: 0 in 0.383s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.958 186180 DEBUG nova.virt.libvirt.driver [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.959 186180 DEBUG nova.virt.libvirt.vif [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:24:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1670663905',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1670663905',id=3,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:25:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-q3y1lfby',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:25:03Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=158c51e6-71fc-497d-9677-0db04ae83881,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "address": "fa:16:3e:f2:13:78", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap41b5d6ce-d6", "ovs_interfaceid": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.959 186180 DEBUG nova.network.os_vif_util [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "address": "fa:16:3e:f2:13:78", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap41b5d6ce-d6", "ovs_interfaceid": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.960 186180 DEBUG nova.network.os_vif_util [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:13:78,bridge_name='br-int',has_traffic_filtering=True,id=41b5d6ce-d60c-4a88-8387-dca85adb1373,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41b5d6ce-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.960 186180 DEBUG os_vif [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:13:78,bridge_name='br-int',has_traffic_filtering=True,id=41b5d6ce-d60c-4a88-8387-dca85adb1373,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41b5d6ce-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.961 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.961 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.962 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.964 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.965 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41b5d6ce-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:00 compute-0 nova_compute[186176]: 2026-02-16 17:26:00.965 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap41b5d6ce-d6, col_values=(('external_ids', {'iface-id': '41b5d6ce-d60c-4a88-8387-dca85adb1373', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:13:78', 'vm-uuid': '158c51e6-71fc-497d-9677-0db04ae83881'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:01 compute-0 NetworkManager[56463]: <info>  [1771262761.0011] manager: (tap41b5d6ce-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Feb 16 17:26:01 compute-0 nova_compute[186176]: 2026-02-16 17:26:01.000 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:01 compute-0 nova_compute[186176]: 2026-02-16 17:26:01.004 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:26:01 compute-0 nova_compute[186176]: 2026-02-16 17:26:01.010 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:01 compute-0 nova_compute[186176]: 2026-02-16 17:26:01.011 186180 INFO os_vif [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:13:78,bridge_name='br-int',has_traffic_filtering=True,id=41b5d6ce-d60c-4a88-8387-dca85adb1373,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41b5d6ce-d6')
Feb 16 17:26:01 compute-0 nova_compute[186176]: 2026-02-16 17:26:01.012 186180 DEBUG nova.virt.libvirt.driver [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 17:26:01 compute-0 nova_compute[186176]: 2026-02-16 17:26:01.012 186180 DEBUG nova.compute.manager [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdtto8_r6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='158c51e6-71fc-497d-9677-0db04ae83881',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 17:26:01 compute-0 nova_compute[186176]: 2026-02-16 17:26:01.393 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:01 compute-0 openstack_network_exporter[198360]: ERROR   17:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:26:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:26:01 compute-0 openstack_network_exporter[198360]: ERROR   17:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:26:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:26:03 compute-0 sshd-session[207713]: Accepted publickey for nova from 192.168.122.101 port 58054 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:26:03 compute-0 systemd-logind[821]: New session 30 of user nova.
Feb 16 17:26:03 compute-0 systemd[1]: Started Session 30 of User nova.
Feb 16 17:26:03 compute-0 sshd-session[207713]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:26:03 compute-0 sshd-session[207716]: Received disconnect from 192.168.122.101 port 58054:11: disconnected by user
Feb 16 17:26:03 compute-0 sshd-session[207716]: Disconnected from user nova 192.168.122.101 port 58054
Feb 16 17:26:03 compute-0 sshd-session[207713]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:26:03 compute-0 systemd[1]: session-30.scope: Deactivated successfully.
Feb 16 17:26:03 compute-0 systemd-logind[821]: Session 30 logged out. Waiting for processes to exit.
Feb 16 17:26:03 compute-0 systemd-logind[821]: Removed session 30.
Feb 16 17:26:04 compute-0 sshd-session[207718]: Accepted publickey for nova from 192.168.122.101 port 58068 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:26:04 compute-0 systemd-logind[821]: New session 31 of user nova.
Feb 16 17:26:04 compute-0 systemd[1]: Started Session 31 of User nova.
Feb 16 17:26:04 compute-0 sshd-session[207718]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:26:04 compute-0 sshd-session[207722]: Received disconnect from 192.168.122.101 port 58068:11: disconnected by user
Feb 16 17:26:04 compute-0 sshd-session[207722]: Disconnected from user nova 192.168.122.101 port 58068
Feb 16 17:26:04 compute-0 sshd-session[207718]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:26:04 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Feb 16 17:26:04 compute-0 systemd-logind[821]: Session 31 logged out. Waiting for processes to exit.
Feb 16 17:26:04 compute-0 podman[207721]: 2026-02-16 17:26:04.224165329 +0000 UTC m=+0.086049328 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:26:04 compute-0 systemd-logind[821]: Removed session 31.
Feb 16 17:26:04 compute-0 podman[207748]: 2026-02-16 17:26:04.324062493 +0000 UTC m=+0.081607099 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 17:26:04 compute-0 sshd-session[207749]: Accepted publickey for nova from 192.168.122.101 port 58078 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:26:04 compute-0 systemd-logind[821]: New session 32 of user nova.
Feb 16 17:26:04 compute-0 systemd[1]: Started Session 32 of User nova.
Feb 16 17:26:04 compute-0 sshd-session[207749]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:26:04 compute-0 sshd-session[207779]: Received disconnect from 192.168.122.101 port 58078:11: disconnected by user
Feb 16 17:26:04 compute-0 sshd-session[207779]: Disconnected from user nova 192.168.122.101 port 58078
Feb 16 17:26:04 compute-0 sshd-session[207749]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:26:04 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Feb 16 17:26:04 compute-0 systemd-logind[821]: Session 32 logged out. Waiting for processes to exit.
Feb 16 17:26:04 compute-0 systemd-logind[821]: Removed session 32.
Feb 16 17:26:04 compute-0 nova_compute[186176]: 2026-02-16 17:26:04.699 186180 DEBUG nova.compute.manager [req-5cda81d7-c3b5-498e-a7c8-0b3a156a8c82 req-4e4add94-1de7-4ebf-95cb-715266fc8123 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received event network-vif-unplugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:04 compute-0 nova_compute[186176]: 2026-02-16 17:26:04.700 186180 DEBUG oslo_concurrency.lockutils [req-5cda81d7-c3b5-498e-a7c8-0b3a156a8c82 req-4e4add94-1de7-4ebf-95cb-715266fc8123 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:04 compute-0 nova_compute[186176]: 2026-02-16 17:26:04.701 186180 DEBUG oslo_concurrency.lockutils [req-5cda81d7-c3b5-498e-a7c8-0b3a156a8c82 req-4e4add94-1de7-4ebf-95cb-715266fc8123 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:04 compute-0 nova_compute[186176]: 2026-02-16 17:26:04.701 186180 DEBUG oslo_concurrency.lockutils [req-5cda81d7-c3b5-498e-a7c8-0b3a156a8c82 req-4e4add94-1de7-4ebf-95cb-715266fc8123 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:04 compute-0 nova_compute[186176]: 2026-02-16 17:26:04.701 186180 DEBUG nova.compute.manager [req-5cda81d7-c3b5-498e-a7c8-0b3a156a8c82 req-4e4add94-1de7-4ebf-95cb-715266fc8123 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] No waiting events found dispatching network-vif-unplugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:04 compute-0 nova_compute[186176]: 2026-02-16 17:26:04.702 186180 WARNING nova.compute.manager [req-5cda81d7-c3b5-498e-a7c8-0b3a156a8c82 req-4e4add94-1de7-4ebf-95cb-715266fc8123 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received unexpected event network-vif-unplugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb for instance with vm_state active and task_state resize_migrating.
Feb 16 17:26:05 compute-0 nova_compute[186176]: 2026-02-16 17:26:05.212 186180 DEBUG nova.network.neutron [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Port 41b5d6ce-d60c-4a88-8387-dca85adb1373 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 17:26:05 compute-0 nova_compute[186176]: 2026-02-16 17:26:05.214 186180 DEBUG nova.compute.manager [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpdtto8_r6',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='158c51e6-71fc-497d-9677-0db04ae83881',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 17:26:05 compute-0 nova_compute[186176]: 2026-02-16 17:26:05.348 186180 INFO nova.network.neutron [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Updating port b22b4c3f-9ce1-4cf9-ace4-601e07d884bb with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Feb 16 17:26:05 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 17:26:05 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 17:26:05 compute-0 kernel: tap41b5d6ce-d6: entered promiscuous mode
Feb 16 17:26:05 compute-0 NetworkManager[56463]: <info>  [1771262765.5822] manager: (tap41b5d6ce-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Feb 16 17:26:05 compute-0 nova_compute[186176]: 2026-02-16 17:26:05.582 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:05 compute-0 ovn_controller[96437]: 2026-02-16T17:26:05Z|00044|binding|INFO|Claiming lport 41b5d6ce-d60c-4a88-8387-dca85adb1373 for this additional chassis.
Feb 16 17:26:05 compute-0 ovn_controller[96437]: 2026-02-16T17:26:05Z|00045|binding|INFO|41b5d6ce-d60c-4a88-8387-dca85adb1373: Claiming fa:16:3e:f2:13:78 10.100.0.12
Feb 16 17:26:05 compute-0 ovn_controller[96437]: 2026-02-16T17:26:05Z|00046|binding|INFO|Setting lport 41b5d6ce-d60c-4a88-8387-dca85adb1373 ovn-installed in OVS
Feb 16 17:26:05 compute-0 nova_compute[186176]: 2026-02-16 17:26:05.588 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:05 compute-0 nova_compute[186176]: 2026-02-16 17:26:05.594 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:05 compute-0 systemd-udevd[207813]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:26:05 compute-0 systemd-machined[155631]: New machine qemu-4-instance-00000003.
Feb 16 17:26:05 compute-0 NetworkManager[56463]: <info>  [1771262765.6198] device (tap41b5d6ce-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:26:05 compute-0 NetworkManager[56463]: <info>  [1771262765.6204] device (tap41b5d6ce-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:26:05 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000003.
Feb 16 17:26:06 compute-0 nova_compute[186176]: 2026-02-16 17:26:06.048 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:06 compute-0 nova_compute[186176]: 2026-02-16 17:26:06.395 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.356 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-0d215b2a-91a9-4d0b-a04e-1355b877179d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.356 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-0d215b2a-91a9-4d0b-a04e-1355b877179d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.356 186180 DEBUG nova.network.neutron [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.465 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262767.4646554, 158c51e6-71fc-497d-9677-0db04ae83881 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.466 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] VM Started (Lifecycle Event)
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.516 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.583 186180 DEBUG nova.compute.manager [req-89441093-d28a-4bbd-942a-9f6756ee067f req-8963d8a7-c3d1-4096-af8a-cc093dbee10b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received event network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.584 186180 DEBUG oslo_concurrency.lockutils [req-89441093-d28a-4bbd-942a-9f6756ee067f req-8963d8a7-c3d1-4096-af8a-cc093dbee10b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.584 186180 DEBUG oslo_concurrency.lockutils [req-89441093-d28a-4bbd-942a-9f6756ee067f req-8963d8a7-c3d1-4096-af8a-cc093dbee10b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.585 186180 DEBUG oslo_concurrency.lockutils [req-89441093-d28a-4bbd-942a-9f6756ee067f req-8963d8a7-c3d1-4096-af8a-cc093dbee10b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.585 186180 DEBUG nova.compute.manager [req-89441093-d28a-4bbd-942a-9f6756ee067f req-8963d8a7-c3d1-4096-af8a-cc093dbee10b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] No waiting events found dispatching network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.585 186180 WARNING nova.compute.manager [req-89441093-d28a-4bbd-942a-9f6756ee067f req-8963d8a7-c3d1-4096-af8a-cc093dbee10b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received unexpected event network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb for instance with vm_state active and task_state resize_migrated.
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.614 186180 DEBUG nova.compute.manager [req-bd9f0682-a0e0-4c19-92db-8b3d16217a61 req-eb5693e1-7142-4d5c-bf3b-9708aa8dc7da 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received event network-changed-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.614 186180 DEBUG nova.compute.manager [req-bd9f0682-a0e0-4c19-92db-8b3d16217a61 req-eb5693e1-7142-4d5c-bf3b-9708aa8dc7da 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Refreshing instance network info cache due to event network-changed-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:26:07 compute-0 nova_compute[186176]: 2026-02-16 17:26:07.614 186180 DEBUG oslo_concurrency.lockutils [req-bd9f0682-a0e0-4c19-92db-8b3d16217a61 req-eb5693e1-7142-4d5c-bf3b-9708aa8dc7da 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-0d215b2a-91a9-4d0b-a04e-1355b877179d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:26:08 compute-0 nova_compute[186176]: 2026-02-16 17:26:08.245 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262768.245361, 158c51e6-71fc-497d-9677-0db04ae83881 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:26:08 compute-0 nova_compute[186176]: 2026-02-16 17:26:08.246 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] VM Resumed (Lifecycle Event)
Feb 16 17:26:08 compute-0 nova_compute[186176]: 2026-02-16 17:26:08.267 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:26:08 compute-0 nova_compute[186176]: 2026-02-16 17:26:08.272 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:26:08 compute-0 nova_compute[186176]: 2026-02-16 17:26:08.299 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 17:26:10 compute-0 ovn_controller[96437]: 2026-02-16T17:26:10Z|00047|binding|INFO|Claiming lport 41b5d6ce-d60c-4a88-8387-dca85adb1373 for this chassis.
Feb 16 17:26:10 compute-0 ovn_controller[96437]: 2026-02-16T17:26:10Z|00048|binding|INFO|41b5d6ce-d60c-4a88-8387-dca85adb1373: Claiming fa:16:3e:f2:13:78 10.100.0.12
Feb 16 17:26:10 compute-0 ovn_controller[96437]: 2026-02-16T17:26:10Z|00049|binding|INFO|Setting lport 41b5d6ce-d60c-4a88-8387-dca85adb1373 up in Southbound
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.041 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:13:78 10.100.0.12'], port_security=['fa:16:3e:f2:13:78 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '158c51e6-71fc-497d-9677-0db04ae83881', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'a09b650a-b8da-4ec6-af84-f46bd29af7dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18cf134a-5b0b-4046-bb3d-fdfa0b081c31, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=41b5d6ce-d60c-4a88-8387-dca85adb1373) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.043 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 41b5d6ce-d60c-4a88-8387-dca85adb1373 in datapath 50b90e9d-0874-4370-ad17-1fff2c4cce15 bound to our chassis
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.044 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.064 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[46c0fe9b-9912-4ee3-a46a-4c6df7d1e4f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.088 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[0c93a50a-b3d4-4479-8888-3ce3561674da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.092 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[1901b58a-239d-4200-a2ba-5c7ac61247a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.115 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[4f23050d-b8fe-4137-a5a9-8b2562da53a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.128 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[4227688a-f464-4a27-b0b1-9f638a5aa0ce]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b90e9d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:d8:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 8, 'rx_bytes': 910, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 8, 'rx_bytes': 910, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431264, 'reachable_time': 25027, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207852, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.142 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[460637f2-42b6-4630-a9dd-fa035fe6f86a]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431271, 'tstamp': 431271}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207853, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431273, 'tstamp': 431273}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207853, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.144 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b90e9d-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.145 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.147 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50b90e9d-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.147 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.147 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap50b90e9d-00, col_values=(('external_ids', {'iface-id': '7e8ec4b7-6252-49aa-a342-59a2b0f3de95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:10 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:10.148 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.291 186180 INFO nova.compute.manager [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Post operation of migration started
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.507 186180 DEBUG nova.network.neutron [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Updating instance_info_cache with network_info: [{"id": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "address": "fa:16:3e:dd:98:36", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb22b4c3f-9c", "ovs_interfaceid": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.529 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-0d215b2a-91a9-4d0b-a04e-1355b877179d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.534 186180 DEBUG oslo_concurrency.lockutils [req-bd9f0682-a0e0-4c19-92db-8b3d16217a61 req-eb5693e1-7142-4d5c-bf3b-9708aa8dc7da 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-0d215b2a-91a9-4d0b-a04e-1355b877179d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.534 186180 DEBUG nova.network.neutron [req-bd9f0682-a0e0-4c19-92db-8b3d16217a61 req-eb5693e1-7142-4d5c-bf3b-9708aa8dc7da 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Refreshing network info cache for port b22b4c3f-9ce1-4cf9-ace4-601e07d884bb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.666 186180 DEBUG nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.668 186180 DEBUG nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.668 186180 INFO nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Creating image(s)
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.669 186180 DEBUG nova.objects.instance [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0d215b2a-91a9-4d0b-a04e-1355b877179d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.687 186180 DEBUG oslo_concurrency.processutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.733 186180 DEBUG oslo_concurrency.processutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.734 186180 DEBUG nova.virt.disk.api [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Checking if we can resize image /var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.734 186180 DEBUG oslo_concurrency.processutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.759 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-158c51e6-71fc-497d-9677-0db04ae83881" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.760 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-158c51e6-71fc-497d-9677-0db04ae83881" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.760 186180 DEBUG nova.network.neutron [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.782 186180 DEBUG oslo_concurrency.processutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.782 186180 DEBUG nova.virt.disk.api [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Cannot resize image /var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.828 186180 DEBUG nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.829 186180 DEBUG nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Ensure instance console log exists: /var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.829 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.830 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.830 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.833 186180 DEBUG nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Start _get_guest_xml network_info=[{"id": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "address": "fa:16:3e:dd:98:36", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "vif_mac": "fa:16:3e:dd:98:36"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb22b4c3f-9c", "ovs_interfaceid": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.838 186180 WARNING nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.845 186180 DEBUG nova.virt.libvirt.host [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.846 186180 DEBUG nova.virt.libvirt.host [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.849 186180 DEBUG nova.virt.libvirt.host [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.849 186180 DEBUG nova.virt.libvirt.host [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.851 186180 DEBUG nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.851 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.851 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.852 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.852 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.852 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.852 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.853 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.855 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.855 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.856 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.856 186180 DEBUG nova.virt.hardware [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.856 186180 DEBUG nova.objects.instance [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0d215b2a-91a9-4d0b-a04e-1355b877179d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.878 186180 DEBUG oslo_concurrency.processutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.930 186180 DEBUG oslo_concurrency.processutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/disk.config --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.931 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "/var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.932 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "/var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.933 186180 DEBUG oslo_concurrency.lockutils [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "/var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.934 186180 DEBUG nova.virt.libvirt.vif [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:25:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-892323844',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-892323844',id=5,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:25:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-kkz23cbt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:26:04Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=0d215b2a-91a9-4d0b-a04e-1355b877179d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "address": "fa:16:3e:dd:98:36", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "vif_mac": "fa:16:3e:dd:98:36"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb22b4c3f-9c", "ovs_interfaceid": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.934 186180 DEBUG nova.network.os_vif_util [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "address": "fa:16:3e:dd:98:36", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "vif_mac": "fa:16:3e:dd:98:36"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb22b4c3f-9c", "ovs_interfaceid": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.935 186180 DEBUG nova.network.os_vif_util [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:98:36,bridge_name='br-int',has_traffic_filtering=True,id=b22b4c3f-9ce1-4cf9-ace4-601e07d884bb,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb22b4c3f-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.937 186180 DEBUG nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:26:10 compute-0 nova_compute[186176]:   <uuid>0d215b2a-91a9-4d0b-a04e-1355b877179d</uuid>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   <name>instance-00000005</name>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-892323844</nova:name>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:26:10</nova:creationTime>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:26:10 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:26:10 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:26:10 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:26:10 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:26:10 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:26:10 compute-0 nova_compute[186176]:         <nova:user uuid="04e81d9e145a466bbabfe4fdaf9f09aa">tempest-TestExecuteActionsViaActuator-900316824-project-member</nova:user>
Feb 16 17:26:10 compute-0 nova_compute[186176]:         <nova:project uuid="97a4c97daa7a495f91b4f65a132f7c0f">tempest-TestExecuteActionsViaActuator-900316824</nova:project>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:26:10 compute-0 nova_compute[186176]:         <nova:port uuid="b22b4c3f-9ce1-4cf9-ace4-601e07d884bb">
Feb 16 17:26:10 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <system>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <entry name="serial">0d215b2a-91a9-4d0b-a04e-1355b877179d</entry>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <entry name="uuid">0d215b2a-91a9-4d0b-a04e-1355b877179d</entry>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     </system>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   <os>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   </os>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   <features>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   </features>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/disk"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/disk.config"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:dd:98:36"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <target dev="tapb22b4c3f-9c"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d/console.log" append="off"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <video>
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     </video>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:26:10 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:26:10 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:26:10 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:26:10 compute-0 nova_compute[186176]: </domain>
Feb 16 17:26:10 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.938 186180 DEBUG nova.virt.libvirt.vif [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:25:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-892323844',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-892323844',id=5,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:25:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-kkz23cbt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:26:04Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=0d215b2a-91a9-4d0b-a04e-1355b877179d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "address": "fa:16:3e:dd:98:36", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "vif_mac": "fa:16:3e:dd:98:36"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb22b4c3f-9c", "ovs_interfaceid": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.938 186180 DEBUG nova.network.os_vif_util [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "address": "fa:16:3e:dd:98:36", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "vif_mac": "fa:16:3e:dd:98:36"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb22b4c3f-9c", "ovs_interfaceid": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.939 186180 DEBUG nova.network.os_vif_util [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:98:36,bridge_name='br-int',has_traffic_filtering=True,id=b22b4c3f-9ce1-4cf9-ace4-601e07d884bb,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb22b4c3f-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.939 186180 DEBUG os_vif [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:98:36,bridge_name='br-int',has_traffic_filtering=True,id=b22b4c3f-9ce1-4cf9-ace4-601e07d884bb,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb22b4c3f-9c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.940 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.940 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.941 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.943 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.943 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb22b4c3f-9c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.944 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb22b4c3f-9c, col_values=(('external_ids', {'iface-id': 'b22b4c3f-9ce1-4cf9-ace4-601e07d884bb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:98:36', 'vm-uuid': '0d215b2a-91a9-4d0b-a04e-1355b877179d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.946 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:10 compute-0 NetworkManager[56463]: <info>  [1771262770.9470] manager: (tapb22b4c3f-9c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.948 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.951 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:10 compute-0 nova_compute[186176]: 2026-02-16 17:26:10.952 186180 INFO os_vif [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:98:36,bridge_name='br-int',has_traffic_filtering=True,id=b22b4c3f-9ce1-4cf9-ace4-601e07d884bb,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb22b4c3f-9c')
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.010 186180 DEBUG nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.010 186180 DEBUG nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.010 186180 DEBUG nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] No VIF found with MAC fa:16:3e:dd:98:36, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.011 186180 INFO nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Using config drive
Feb 16 17:26:11 compute-0 kernel: tapb22b4c3f-9c: entered promiscuous mode
Feb 16 17:26:11 compute-0 NetworkManager[56463]: <info>  [1771262771.0520] manager: (tapb22b4c3f-9c): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.053 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:11 compute-0 ovn_controller[96437]: 2026-02-16T17:26:11Z|00050|binding|INFO|Claiming lport b22b4c3f-9ce1-4cf9-ace4-601e07d884bb for this chassis.
Feb 16 17:26:11 compute-0 ovn_controller[96437]: 2026-02-16T17:26:11Z|00051|binding|INFO|b22b4c3f-9ce1-4cf9-ace4-601e07d884bb: Claiming fa:16:3e:dd:98:36 10.100.0.13
Feb 16 17:26:11 compute-0 ovn_controller[96437]: 2026-02-16T17:26:11Z|00052|binding|INFO|Setting lport b22b4c3f-9ce1-4cf9-ace4-601e07d884bb ovn-installed in OVS
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.060 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:11 compute-0 systemd-machined[155631]: New machine qemu-5-instance-00000005.
Feb 16 17:26:11 compute-0 systemd-udevd[207881]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:26:11 compute-0 NetworkManager[56463]: <info>  [1771262771.0842] device (tapb22b4c3f-9c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:26:11 compute-0 NetworkManager[56463]: <info>  [1771262771.0848] device (tapb22b4c3f-9c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:26:11 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.141 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:98:36 10.100.0.13'], port_security=['fa:16:3e:dd:98:36 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0d215b2a-91a9-4d0b-a04e-1355b877179d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a09b650a-b8da-4ec6-af84-f46bd29af7dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18cf134a-5b0b-4046-bb3d-fdfa0b081c31, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=b22b4c3f-9ce1-4cf9-ace4-601e07d884bb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:26:11 compute-0 ovn_controller[96437]: 2026-02-16T17:26:11Z|00053|binding|INFO|Setting lport b22b4c3f-9ce1-4cf9-ace4-601e07d884bb up in Southbound
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.142 105730 INFO neutron.agent.ovn.metadata.agent [-] Port b22b4c3f-9ce1-4cf9-ace4-601e07d884bb in datapath 50b90e9d-0874-4370-ad17-1fff2c4cce15 bound to our chassis
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.144 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.158 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[58bffd23-117d-4b5c-b0ae-7fa5d728812d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.182 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[68597f94-8b93-42f3-84e4-d2349312b32d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.188 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[c83a4281-d0f2-4d94-afca-18ca3134fe2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.212 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[646da82b-f158-4243-a3da-9564c1603e04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.228 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[af69ea70-1c51-4f92-b917-8fe7febab1c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b90e9d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:d8:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 10, 'rx_bytes': 910, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 10, 'rx_bytes': 910, 'tx_bytes': 608, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431264, 'reachable_time': 25027, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207895, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.240 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3024ccc9-e3c7-4d0d-9f4b-947f4f7fee04]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431271, 'tstamp': 431271}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207896, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431273, 'tstamp': 431273}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207896, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.241 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b90e9d-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.243 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.244 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50b90e9d-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.244 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.244 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap50b90e9d-00, col_values=(('external_ids', {'iface-id': '7e8ec4b7-6252-49aa-a342-59a2b0f3de95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:11.244 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.398 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.624 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262771.6239288, 0d215b2a-91a9-4d0b-a04e-1355b877179d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.626 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] VM Resumed (Lifecycle Event)
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.628 186180 DEBUG nova.compute.manager [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.633 186180 INFO nova.virt.libvirt.driver [-] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Instance running successfully.
Feb 16 17:26:11 compute-0 virtqemud[185389]: argument unsupported: QEMU guest agent is not configured
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.636 186180 DEBUG nova.virt.libvirt.guest [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.636 186180 DEBUG nova.virt.libvirt.driver [None req-f66149a4-ebaf-4aca-a9ce-6371a45fda80 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.652 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.655 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.682 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] During sync_power_state the instance has a pending task (resize_finish). Skip.
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.683 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262771.6251037, 0d215b2a-91a9-4d0b-a04e-1355b877179d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.683 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] VM Started (Lifecycle Event)
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.717 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.721 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:26:11 compute-0 nova_compute[186176]: 2026-02-16 17:26:11.744 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] During sync_power_state the instance has a pending task (resize_finish). Skip.
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.114 186180 DEBUG nova.compute.manager [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received event network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.114 186180 DEBUG oslo_concurrency.lockutils [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.114 186180 DEBUG oslo_concurrency.lockutils [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.114 186180 DEBUG oslo_concurrency.lockutils [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.115 186180 DEBUG nova.compute.manager [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] No waiting events found dispatching network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.115 186180 WARNING nova.compute.manager [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received unexpected event network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb for instance with vm_state resized and task_state None.
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.115 186180 DEBUG nova.compute.manager [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received event network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.115 186180 DEBUG oslo_concurrency.lockutils [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.116 186180 DEBUG oslo_concurrency.lockutils [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.116 186180 DEBUG oslo_concurrency.lockutils [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.116 186180 DEBUG nova.compute.manager [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] No waiting events found dispatching network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.116 186180 WARNING nova.compute.manager [req-20d6d629-4202-41dc-931b-8012491194dc req-97a18716-fe46-476f-a018-e5f9d1adfd23 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received unexpected event network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb for instance with vm_state resized and task_state None.
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.789 186180 DEBUG nova.network.neutron [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Updating instance_info_cache with network_info: [{"id": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "address": "fa:16:3e:f2:13:78", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41b5d6ce-d6", "ovs_interfaceid": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.812 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-158c51e6-71fc-497d-9677-0db04ae83881" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.822 186180 DEBUG nova.network.neutron [req-bd9f0682-a0e0-4c19-92db-8b3d16217a61 req-eb5693e1-7142-4d5c-bf3b-9708aa8dc7da 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Updated VIF entry in instance network info cache for port b22b4c3f-9ce1-4cf9-ace4-601e07d884bb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.822 186180 DEBUG nova.network.neutron [req-bd9f0682-a0e0-4c19-92db-8b3d16217a61 req-eb5693e1-7142-4d5c-bf3b-9708aa8dc7da 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Updating instance_info_cache with network_info: [{"id": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "address": "fa:16:3e:dd:98:36", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb22b4c3f-9c", "ovs_interfaceid": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.833 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.834 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.834 186180 DEBUG oslo_concurrency.lockutils [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.839 186180 INFO nova.virt.libvirt.driver [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 17:26:12 compute-0 virtqemud[185389]: Domain id=4 name='instance-00000003' uuid=158c51e6-71fc-497d-9677-0db04ae83881 is tainted: custom-monitor
Feb 16 17:26:12 compute-0 nova_compute[186176]: 2026-02-16 17:26:12.851 186180 DEBUG oslo_concurrency.lockutils [req-bd9f0682-a0e0-4c19-92db-8b3d16217a61 req-eb5693e1-7142-4d5c-bf3b-9708aa8dc7da 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-0d215b2a-91a9-4d0b-a04e-1355b877179d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:26:13 compute-0 nova_compute[186176]: 2026-02-16 17:26:13.847 186180 INFO nova.virt.libvirt.driver [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 17:26:14 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 17:26:14 compute-0 systemd[207668]: Activating special unit Exit the Session...
Feb 16 17:26:14 compute-0 systemd[207668]: Stopped target Main User Target.
Feb 16 17:26:14 compute-0 systemd[207668]: Stopped target Basic System.
Feb 16 17:26:14 compute-0 systemd[207668]: Stopped target Paths.
Feb 16 17:26:14 compute-0 systemd[207668]: Stopped target Sockets.
Feb 16 17:26:14 compute-0 systemd[207668]: Stopped target Timers.
Feb 16 17:26:14 compute-0 systemd[207668]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:26:14 compute-0 systemd[207668]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 17:26:14 compute-0 systemd[207668]: Closed D-Bus User Message Bus Socket.
Feb 16 17:26:14 compute-0 systemd[207668]: Stopped Create User's Volatile Files and Directories.
Feb 16 17:26:14 compute-0 systemd[207668]: Removed slice User Application Slice.
Feb 16 17:26:14 compute-0 systemd[207668]: Reached target Shutdown.
Feb 16 17:26:14 compute-0 systemd[207668]: Finished Exit the Session.
Feb 16 17:26:14 compute-0 systemd[207668]: Reached target Exit the Session.
Feb 16 17:26:14 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 17:26:14 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 17:26:14 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 17:26:14 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 17:26:14 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 17:26:14 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 17:26:14 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 17:26:14 compute-0 nova_compute[186176]: 2026-02-16 17:26:14.854 186180 INFO nova.virt.libvirt.driver [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 17:26:14 compute-0 nova_compute[186176]: 2026-02-16 17:26:14.863 186180 DEBUG nova.compute.manager [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:26:15 compute-0 nova_compute[186176]: 2026-02-16 17:26:15.020 186180 DEBUG nova.objects.instance [None req-32fd7cca-37ef-41d4-83d4-4c12e0177aa6 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 17:26:16 compute-0 nova_compute[186176]: 2026-02-16 17:26:16.181 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:16 compute-0 nova_compute[186176]: 2026-02-16 17:26:16.400 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:21 compute-0 nova_compute[186176]: 2026-02-16 17:26:21.215 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:21 compute-0 nova_compute[186176]: 2026-02-16 17:26:21.402 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:23 compute-0 ovn_controller[96437]: 2026-02-16T17:26:23Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:98:36 10.100.0.13
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.663 186180 DEBUG oslo_concurrency.lockutils [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.664 186180 DEBUG oslo_concurrency.lockutils [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.665 186180 DEBUG oslo_concurrency.lockutils [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.666 186180 DEBUG oslo_concurrency.lockutils [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.667 186180 DEBUG oslo_concurrency.lockutils [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.670 186180 INFO nova.compute.manager [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Terminating instance
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.672 186180 DEBUG nova.compute.manager [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:26:23 compute-0 kernel: tapd83659d0-e8 (unregistering): left promiscuous mode
Feb 16 17:26:23 compute-0 NetworkManager[56463]: <info>  [1771262783.7110] device (tapd83659d0-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:26:23 compute-0 ovn_controller[96437]: 2026-02-16T17:26:23Z|00054|binding|INFO|Releasing lport d83659d0-e89e-455a-91b4-462622d79d07 from this chassis (sb_readonly=0)
Feb 16 17:26:23 compute-0 ovn_controller[96437]: 2026-02-16T17:26:23Z|00055|binding|INFO|Setting lport d83659d0-e89e-455a-91b4-462622d79d07 down in Southbound
Feb 16 17:26:23 compute-0 ovn_controller[96437]: 2026-02-16T17:26:23Z|00056|binding|INFO|Removing iface tapd83659d0-e8 ovn-installed in OVS
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.716 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.719 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.727 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:08:e1 10.100.0.6'], port_security=['fa:16:3e:35:08:e1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7a804a24-fd5e-4882-be31-38bfdfa8c2c3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a09b650a-b8da-4ec6-af84-f46bd29af7dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18cf134a-5b0b-4046-bb3d-fdfa0b081c31, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=d83659d0-e89e-455a-91b4-462622d79d07) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.727 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.728 105730 INFO neutron.agent.ovn.metadata.agent [-] Port d83659d0-e89e-455a-91b4-462622d79d07 in datapath 50b90e9d-0874-4370-ad17-1fff2c4cce15 unbound from our chassis
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.729 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.751 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[57025167-302c-4ba8-987c-524ed293ec48]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:23 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully.
Feb 16 17:26:23 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 13.493s CPU time.
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.781 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[65d67b90-d612-4f00-973a-575ab9b56d2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:23 compute-0 systemd-machined[155631]: Machine qemu-3-instance-00000006 terminated.
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.786 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[3fabbf50-e8a7-435d-ba3d-b2fb9fbd6887]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.813 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[cbbacefc-890e-4272-adc4-a1f1cc4f45c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.833 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[bcab57b2-b3e2-4b34-8029-80bc2a5ee2c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b90e9d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:d8:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 12, 'rx_bytes': 1540, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 12, 'rx_bytes': 1540, 'tx_bytes': 692, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431264, 'reachable_time': 25027, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207934, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.850 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f6dbbc2d-a9e9-411b-9f7a-725bbc7ab0d9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431271, 'tstamp': 431271}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207935, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431273, 'tstamp': 431273}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207935, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.852 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b90e9d-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.854 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.858 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50b90e9d-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.858 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.858 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.859 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap50b90e9d-00, col_values=(('external_ids', {'iface-id': '7e8ec4b7-6252-49aa-a342-59a2b0f3de95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:23.859 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.935 186180 INFO nova.virt.libvirt.driver [-] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Instance destroyed successfully.
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.936 186180 DEBUG nova.objects.instance [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lazy-loading 'resources' on Instance uuid 7a804a24-fd5e-4882-be31-38bfdfa8c2c3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.954 186180 DEBUG nova.virt.libvirt.vif [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:25:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-179170323',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-179170323',id=6,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:25:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-3no5x7ds',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:25:43Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=7a804a24-fd5e-4882-be31-38bfdfa8c2c3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d83659d0-e89e-455a-91b4-462622d79d07", "address": "fa:16:3e:35:08:e1", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83659d0-e8", "ovs_interfaceid": "d83659d0-e89e-455a-91b4-462622d79d07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.955 186180 DEBUG nova.network.os_vif_util [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converting VIF {"id": "d83659d0-e89e-455a-91b4-462622d79d07", "address": "fa:16:3e:35:08:e1", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd83659d0-e8", "ovs_interfaceid": "d83659d0-e89e-455a-91b4-462622d79d07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.956 186180 DEBUG nova.network.os_vif_util [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:08:e1,bridge_name='br-int',has_traffic_filtering=True,id=d83659d0-e89e-455a-91b4-462622d79d07,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83659d0-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.956 186180 DEBUG os_vif [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:08:e1,bridge_name='br-int',has_traffic_filtering=True,id=d83659d0-e89e-455a-91b4-462622d79d07,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83659d0-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.958 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.958 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd83659d0-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.960 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.963 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.965 186180 INFO os_vif [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:08:e1,bridge_name='br-int',has_traffic_filtering=True,id=d83659d0-e89e-455a-91b4-462622d79d07,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd83659d0-e8')
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.966 186180 INFO nova.virt.libvirt.driver [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Deleting instance files /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3_del
Feb 16 17:26:23 compute-0 nova_compute[186176]: 2026-02-16 17:26:23.967 186180 INFO nova.virt.libvirt.driver [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Deletion of /var/lib/nova/instances/7a804a24-fd5e-4882-be31-38bfdfa8c2c3_del complete
Feb 16 17:26:24 compute-0 nova_compute[186176]: 2026-02-16 17:26:24.068 186180 INFO nova.compute.manager [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Took 0.39 seconds to destroy the instance on the hypervisor.
Feb 16 17:26:24 compute-0 nova_compute[186176]: 2026-02-16 17:26:24.068 186180 DEBUG oslo.service.loopingcall [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:26:24 compute-0 nova_compute[186176]: 2026-02-16 17:26:24.069 186180 DEBUG nova.compute.manager [-] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:26:24 compute-0 nova_compute[186176]: 2026-02-16 17:26:24.069 186180 DEBUG nova.network.neutron [-] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:26:24 compute-0 nova_compute[186176]: 2026-02-16 17:26:24.370 186180 DEBUG nova.compute.manager [req-1417ca59-d928-48f7-9f9b-bf66ff5e08b0 req-c00d772a-29da-4d34-a4cb-323f054bb9ef 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Received event network-vif-unplugged-d83659d0-e89e-455a-91b4-462622d79d07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:24 compute-0 nova_compute[186176]: 2026-02-16 17:26:24.370 186180 DEBUG oslo_concurrency.lockutils [req-1417ca59-d928-48f7-9f9b-bf66ff5e08b0 req-c00d772a-29da-4d34-a4cb-323f054bb9ef 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:24 compute-0 nova_compute[186176]: 2026-02-16 17:26:24.371 186180 DEBUG oslo_concurrency.lockutils [req-1417ca59-d928-48f7-9f9b-bf66ff5e08b0 req-c00d772a-29da-4d34-a4cb-323f054bb9ef 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:24 compute-0 nova_compute[186176]: 2026-02-16 17:26:24.371 186180 DEBUG oslo_concurrency.lockutils [req-1417ca59-d928-48f7-9f9b-bf66ff5e08b0 req-c00d772a-29da-4d34-a4cb-323f054bb9ef 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:24 compute-0 nova_compute[186176]: 2026-02-16 17:26:24.371 186180 DEBUG nova.compute.manager [req-1417ca59-d928-48f7-9f9b-bf66ff5e08b0 req-c00d772a-29da-4d34-a4cb-323f054bb9ef 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] No waiting events found dispatching network-vif-unplugged-d83659d0-e89e-455a-91b4-462622d79d07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:24 compute-0 nova_compute[186176]: 2026-02-16 17:26:24.371 186180 DEBUG nova.compute.manager [req-1417ca59-d928-48f7-9f9b-bf66ff5e08b0 req-c00d772a-29da-4d34-a4cb-323f054bb9ef 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Received event network-vif-unplugged-d83659d0-e89e-455a-91b4-462622d79d07 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:26:25 compute-0 podman[207954]: 2026-02-16 17:26:25.09698794 +0000 UTC m=+0.071139994 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.172 186180 DEBUG nova.network.neutron [-] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.200 186180 INFO nova.compute.manager [-] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Took 1.13 seconds to deallocate network for instance.
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.265 186180 DEBUG oslo_concurrency.lockutils [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.266 186180 DEBUG oslo_concurrency.lockutils [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.274 186180 DEBUG nova.compute.manager [req-5b8c69d9-91ac-455b-8c02-c2cda7de2760 req-e250d6e0-023c-49cb-a779-ce62bf0cc80b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Received event network-vif-deleted-d83659d0-e89e-455a-91b4-462622d79d07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.428 186180 DEBUG nova.compute.provider_tree [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.449 186180 DEBUG nova.scheduler.client.report [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.480 186180 DEBUG oslo_concurrency.lockutils [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.520 186180 INFO nova.scheduler.client.report [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Deleted allocations for instance 7a804a24-fd5e-4882-be31-38bfdfa8c2c3
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.639 186180 DEBUG oslo_concurrency.lockutils [None req-2f2ca086-452f-458d-81ea-a77f2536506d 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.891 186180 DEBUG oslo_concurrency.lockutils [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "0d215b2a-91a9-4d0b-a04e-1355b877179d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.891 186180 DEBUG oslo_concurrency.lockutils [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.892 186180 DEBUG oslo_concurrency.lockutils [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.892 186180 DEBUG oslo_concurrency.lockutils [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.892 186180 DEBUG oslo_concurrency.lockutils [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.893 186180 INFO nova.compute.manager [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Terminating instance
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.894 186180 DEBUG nova.compute.manager [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:26:25 compute-0 kernel: tapb22b4c3f-9c (unregistering): left promiscuous mode
Feb 16 17:26:25 compute-0 NetworkManager[56463]: <info>  [1771262785.9194] device (tapb22b4c3f-9c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.966 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:25 compute-0 ovn_controller[96437]: 2026-02-16T17:26:25Z|00057|binding|INFO|Releasing lport b22b4c3f-9ce1-4cf9-ace4-601e07d884bb from this chassis (sb_readonly=0)
Feb 16 17:26:25 compute-0 ovn_controller[96437]: 2026-02-16T17:26:25Z|00058|binding|INFO|Setting lport b22b4c3f-9ce1-4cf9-ace4-601e07d884bb down in Southbound
Feb 16 17:26:25 compute-0 ovn_controller[96437]: 2026-02-16T17:26:25Z|00059|binding|INFO|Removing iface tapb22b4c3f-9c ovn-installed in OVS
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.969 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:25 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:25.975 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:98:36 10.100.0.13'], port_security=['fa:16:3e:dd:98:36 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0d215b2a-91a9-4d0b-a04e-1355b877179d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a09b650a-b8da-4ec6-af84-f46bd29af7dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18cf134a-5b0b-4046-bb3d-fdfa0b081c31, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=b22b4c3f-9ce1-4cf9-ace4-601e07d884bb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:26:25 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:25.978 105730 INFO neutron.agent.ovn.metadata.agent [-] Port b22b4c3f-9ce1-4cf9-ace4-601e07d884bb in datapath 50b90e9d-0874-4370-ad17-1fff2c4cce15 unbound from our chassis
Feb 16 17:26:25 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:25.979 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:26:25 compute-0 nova_compute[186176]: 2026-02-16 17:26:25.979 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:25 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:25.996 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[0559ff90-d109-463e-8432-cc2b56332902]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:26 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Feb 16 17:26:26 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 11.338s CPU time.
Feb 16 17:26:26 compute-0 systemd-machined[155631]: Machine qemu-5-instance-00000005 terminated.
Feb 16 17:26:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:26.022 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[52768b07-e465-4e9d-877b-54b542fbd2f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:26.026 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[ba79c97a-3929-4af1-aaed-d3f2fdbca7b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:26.052 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[da78c5de-47b1-4927-946f-c29170850cab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:26.070 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[1ddf789a-02fe-4149-a178-b8f06d4fbe80]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b90e9d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:d8:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 14, 'rx_bytes': 1540, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 14, 'rx_bytes': 1540, 'tx_bytes': 776, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431264, 'reachable_time': 25027, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 207986, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:26.089 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[8caf99f7-a301-476b-b0ac-6e4d87d4e731]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431271, 'tstamp': 431271}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207987, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431273, 'tstamp': 431273}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 207987, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:26.092 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b90e9d-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.095 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:26.100 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50b90e9d-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.099 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:26.100 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:26.103 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap50b90e9d-00, col_values=(('external_ids', {'iface-id': '7e8ec4b7-6252-49aa-a342-59a2b0f3de95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:26.104 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.158 186180 INFO nova.virt.libvirt.driver [-] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Instance destroyed successfully.
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.159 186180 DEBUG nova.objects.instance [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lazy-loading 'resources' on Instance uuid 0d215b2a-91a9-4d0b-a04e-1355b877179d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.178 186180 DEBUG nova.virt.libvirt.vif [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:25:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-892323844',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-892323844',id=5,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:26:11Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-kkz23cbt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:26:19Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=0d215b2a-91a9-4d0b-a04e-1355b877179d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "address": "fa:16:3e:dd:98:36", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb22b4c3f-9c", "ovs_interfaceid": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.178 186180 DEBUG nova.network.os_vif_util [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converting VIF {"id": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "address": "fa:16:3e:dd:98:36", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb22b4c3f-9c", "ovs_interfaceid": "b22b4c3f-9ce1-4cf9-ace4-601e07d884bb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.179 186180 DEBUG nova.network.os_vif_util [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:98:36,bridge_name='br-int',has_traffic_filtering=True,id=b22b4c3f-9ce1-4cf9-ace4-601e07d884bb,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb22b4c3f-9c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.179 186180 DEBUG os_vif [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:98:36,bridge_name='br-int',has_traffic_filtering=True,id=b22b4c3f-9ce1-4cf9-ace4-601e07d884bb,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb22b4c3f-9c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.180 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.180 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb22b4c3f-9c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.182 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.184 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.187 186180 INFO os_vif [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:98:36,bridge_name='br-int',has_traffic_filtering=True,id=b22b4c3f-9ce1-4cf9-ace4-601e07d884bb,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb22b4c3f-9c')
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.188 186180 INFO nova.virt.libvirt.driver [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Deleting instance files /var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d_del
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.191 186180 INFO nova.virt.libvirt.driver [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Deletion of /var/lib/nova/instances/0d215b2a-91a9-4d0b-a04e-1355b877179d_del complete
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.269 186180 INFO nova.compute.manager [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Took 0.37 seconds to destroy the instance on the hypervisor.
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.270 186180 DEBUG oslo.service.loopingcall [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.270 186180 DEBUG nova.compute.manager [-] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.270 186180 DEBUG nova.network.neutron [-] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.404 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.544 186180 DEBUG nova.compute.manager [req-16078ad9-b16b-420e-aa9a-e18dacb4ed30 req-5ed05660-570d-4038-963f-3811537d148b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Received event network-vif-plugged-d83659d0-e89e-455a-91b4-462622d79d07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.545 186180 DEBUG oslo_concurrency.lockutils [req-16078ad9-b16b-420e-aa9a-e18dacb4ed30 req-5ed05660-570d-4038-963f-3811537d148b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.546 186180 DEBUG oslo_concurrency.lockutils [req-16078ad9-b16b-420e-aa9a-e18dacb4ed30 req-5ed05660-570d-4038-963f-3811537d148b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.546 186180 DEBUG oslo_concurrency.lockutils [req-16078ad9-b16b-420e-aa9a-e18dacb4ed30 req-5ed05660-570d-4038-963f-3811537d148b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "7a804a24-fd5e-4882-be31-38bfdfa8c2c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.547 186180 DEBUG nova.compute.manager [req-16078ad9-b16b-420e-aa9a-e18dacb4ed30 req-5ed05660-570d-4038-963f-3811537d148b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] No waiting events found dispatching network-vif-plugged-d83659d0-e89e-455a-91b4-462622d79d07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.547 186180 WARNING nova.compute.manager [req-16078ad9-b16b-420e-aa9a-e18dacb4ed30 req-5ed05660-570d-4038-963f-3811537d148b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Received unexpected event network-vif-plugged-d83659d0-e89e-455a-91b4-462622d79d07 for instance with vm_state deleted and task_state None.
Feb 16 17:26:26 compute-0 nova_compute[186176]: 2026-02-16 17:26:26.982 186180 DEBUG nova.network.neutron [-] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.001 186180 INFO nova.compute.manager [-] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Took 0.73 seconds to deallocate network for instance.
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.059 186180 DEBUG oslo_concurrency.lockutils [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.060 186180 DEBUG oslo_concurrency.lockutils [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.064 186180 DEBUG oslo_concurrency.lockutils [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.099 186180 INFO nova.scheduler.client.report [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Deleted allocations for instance 0d215b2a-91a9-4d0b-a04e-1355b877179d
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.159 186180 DEBUG oslo_concurrency.lockutils [None req-9853fee4-a733-4a1e-867e-d264bde45b60 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.375 186180 DEBUG nova.compute.manager [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received event network-vif-unplugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.376 186180 DEBUG oslo_concurrency.lockutils [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.376 186180 DEBUG oslo_concurrency.lockutils [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.377 186180 DEBUG oslo_concurrency.lockutils [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.377 186180 DEBUG nova.compute.manager [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] No waiting events found dispatching network-vif-unplugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.378 186180 WARNING nova.compute.manager [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received unexpected event network-vif-unplugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb for instance with vm_state deleted and task_state None.
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.378 186180 DEBUG nova.compute.manager [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received event network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.379 186180 DEBUG oslo_concurrency.lockutils [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.379 186180 DEBUG oslo_concurrency.lockutils [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.380 186180 DEBUG oslo_concurrency.lockutils [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0d215b2a-91a9-4d0b-a04e-1355b877179d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.380 186180 DEBUG nova.compute.manager [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] No waiting events found dispatching network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.380 186180 WARNING nova.compute.manager [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received unexpected event network-vif-plugged-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb for instance with vm_state deleted and task_state None.
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.381 186180 DEBUG nova.compute.manager [req-404dc03e-3532-4753-9c23-3770466a8044 req-6ad477f4-1e42-428d-b825-d63a3bac2125 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Received event network-vif-deleted-b22b4c3f-9ce1-4cf9-ace4-601e07d884bb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.995 186180 DEBUG oslo_concurrency.lockutils [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "49698b66-fe7c-4448-88b5-13f0281298da" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.996 186180 DEBUG oslo_concurrency.lockutils [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.997 186180 DEBUG oslo_concurrency.lockutils [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "49698b66-fe7c-4448-88b5-13f0281298da-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.997 186180 DEBUG oslo_concurrency.lockutils [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:27 compute-0 nova_compute[186176]: 2026-02-16 17:26:27.998 186180 DEBUG oslo_concurrency.lockutils [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.000 186180 INFO nova.compute.manager [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Terminating instance
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.003 186180 DEBUG nova.compute.manager [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:26:28 compute-0 kernel: taped2a59c8-33 (unregistering): left promiscuous mode
Feb 16 17:26:28 compute-0 ovn_controller[96437]: 2026-02-16T17:26:28Z|00060|binding|INFO|Releasing lport ed2a59c8-33d0-43c7-bb70-bee7dc282734 from this chassis (sb_readonly=0)
Feb 16 17:26:28 compute-0 ovn_controller[96437]: 2026-02-16T17:26:28Z|00061|binding|INFO|Setting lport ed2a59c8-33d0-43c7-bb70-bee7dc282734 down in Southbound
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.035 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:28 compute-0 ovn_controller[96437]: 2026-02-16T17:26:28Z|00062|binding|INFO|Removing iface taped2a59c8-33 ovn-installed in OVS
Feb 16 17:26:28 compute-0 NetworkManager[56463]: <info>  [1771262788.0371] device (taped2a59c8-33): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.045 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:d1:25 10.100.0.11'], port_security=['fa:16:3e:c0:d1:25 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '49698b66-fe7c-4448-88b5-13f0281298da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a09b650a-b8da-4ec6-af84-f46bd29af7dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18cf134a-5b0b-4046-bb3d-fdfa0b081c31, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=ed2a59c8-33d0-43c7-bb70-bee7dc282734) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.047 105730 INFO neutron.agent.ovn.metadata.agent [-] Port ed2a59c8-33d0-43c7-bb70-bee7dc282734 in datapath 50b90e9d-0874-4370-ad17-1fff2c4cce15 unbound from our chassis
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.048 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 50b90e9d-0874-4370-ad17-1fff2c4cce15
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.049 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.066 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[4c32d960-a1d3-49a6-9ffd-c0d16d8d44bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:28 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Feb 16 17:26:28 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 13.780s CPU time.
Feb 16 17:26:28 compute-0 systemd-machined[155631]: Machine qemu-2-instance-00000004 terminated.
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.097 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[682c7d23-8917-41d7-beba-92b8b44cf06a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.101 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[fec7a389-e08d-404c-9dec-790894d3ecba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.125 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[2a37b199-f72d-40d5-989e-4b81bb3908ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.145 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[58ab7188-8dec-40af-b39a-4ef237c4650b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50b90e9d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8c:d8:89'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 16, 'rx_bytes': 1540, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 16, 'rx_bytes': 1540, 'tx_bytes': 860, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431264, 'reachable_time': 25027, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208025, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.162 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[cefe39ac-72bc-4507-9954-c8f16bb541e5]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431271, 'tstamp': 431271}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208027, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap50b90e9d-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431273, 'tstamp': 431273}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208027, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.164 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b90e9d-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.166 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.174 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50b90e9d-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.175 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.175 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap50b90e9d-00, col_values=(('external_ids', {'iface-id': '7e8ec4b7-6252-49aa-a342-59a2b0f3de95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:28 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:28.175 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.177 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:28 compute-0 podman[208017]: 2026-02-16 17:26:28.22826277 +0000 UTC m=+0.097809215 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.263 186180 INFO nova.virt.libvirt.driver [-] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Instance destroyed successfully.
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.263 186180 DEBUG nova.objects.instance [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lazy-loading 'resources' on Instance uuid 49698b66-fe7c-4448-88b5-13f0281298da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.279 186180 DEBUG nova.virt.libvirt.vif [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:25:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1920538887',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1920538887',id=4,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:25:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-g4zymj4c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:25:17Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=49698b66-fe7c-4448-88b5-13f0281298da,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "address": "fa:16:3e:c0:d1:25", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped2a59c8-33", "ovs_interfaceid": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.280 186180 DEBUG nova.network.os_vif_util [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converting VIF {"id": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "address": "fa:16:3e:c0:d1:25", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "taped2a59c8-33", "ovs_interfaceid": "ed2a59c8-33d0-43c7-bb70-bee7dc282734", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.280 186180 DEBUG nova.network.os_vif_util [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:d1:25,bridge_name='br-int',has_traffic_filtering=True,id=ed2a59c8-33d0-43c7-bb70-bee7dc282734,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped2a59c8-33') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.281 186180 DEBUG os_vif [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:d1:25,bridge_name='br-int',has_traffic_filtering=True,id=ed2a59c8-33d0-43c7-bb70-bee7dc282734,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped2a59c8-33') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.282 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.282 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped2a59c8-33, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.284 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.286 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.288 186180 INFO os_vif [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:d1:25,bridge_name='br-int',has_traffic_filtering=True,id=ed2a59c8-33d0-43c7-bb70-bee7dc282734,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='taped2a59c8-33')
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.289 186180 INFO nova.virt.libvirt.driver [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Deleting instance files /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da_del
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.289 186180 INFO nova.virt.libvirt.driver [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Deletion of /var/lib/nova/instances/49698b66-fe7c-4448-88b5-13f0281298da_del complete
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.347 186180 INFO nova.compute.manager [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Took 0.34 seconds to destroy the instance on the hypervisor.
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.347 186180 DEBUG oslo.service.loopingcall [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.347 186180 DEBUG nova.compute.manager [-] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.348 186180 DEBUG nova.network.neutron [-] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.943 186180 DEBUG nova.network.neutron [-] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:26:28 compute-0 nova_compute[186176]: 2026-02-16 17:26:28.962 186180 INFO nova.compute.manager [-] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Took 0.61 seconds to deallocate network for instance.
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.030 186180 DEBUG oslo_concurrency.lockutils [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.030 186180 DEBUG oslo_concurrency.lockutils [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.113 186180 DEBUG nova.compute.provider_tree [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.133 186180 DEBUG nova.scheduler.client.report [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.181 186180 DEBUG oslo_concurrency.lockutils [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.206 186180 INFO nova.scheduler.client.report [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Deleted allocations for instance 49698b66-fe7c-4448-88b5-13f0281298da
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.281 186180 DEBUG oslo_concurrency.lockutils [None req-d4333021-224e-4d6f-90d2-181fb894366b 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.484 186180 DEBUG nova.compute.manager [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Received event network-vif-unplugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.484 186180 DEBUG oslo_concurrency.lockutils [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "49698b66-fe7c-4448-88b5-13f0281298da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.485 186180 DEBUG oslo_concurrency.lockutils [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.485 186180 DEBUG oslo_concurrency.lockutils [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.485 186180 DEBUG nova.compute.manager [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] No waiting events found dispatching network-vif-unplugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.486 186180 WARNING nova.compute.manager [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Received unexpected event network-vif-unplugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 for instance with vm_state deleted and task_state None.
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.486 186180 DEBUG nova.compute.manager [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Received event network-vif-plugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.486 186180 DEBUG oslo_concurrency.lockutils [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "49698b66-fe7c-4448-88b5-13f0281298da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.487 186180 DEBUG oslo_concurrency.lockutils [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.487 186180 DEBUG oslo_concurrency.lockutils [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "49698b66-fe7c-4448-88b5-13f0281298da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.488 186180 DEBUG nova.compute.manager [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] No waiting events found dispatching network-vif-plugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.488 186180 WARNING nova.compute.manager [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Received unexpected event network-vif-plugged-ed2a59c8-33d0-43c7-bb70-bee7dc282734 for instance with vm_state deleted and task_state None.
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.488 186180 DEBUG nova.compute.manager [req-c0a831f1-4bf8-4abc-9bc9-aeff22e65d4c req-1dd6281c-edec-4d93-a5fe-b7971a212d8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Received event network-vif-deleted-ed2a59c8-33d0-43c7-bb70-bee7dc282734 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:29 compute-0 podman[195505]: time="2026-02-16T17:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:26:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:26:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2634 "" "Go-http-client/1.1"
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.987 186180 DEBUG oslo_concurrency.lockutils [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "158c51e6-71fc-497d-9677-0db04ae83881" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.988 186180 DEBUG oslo_concurrency.lockutils [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "158c51e6-71fc-497d-9677-0db04ae83881" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.989 186180 DEBUG oslo_concurrency.lockutils [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "158c51e6-71fc-497d-9677-0db04ae83881-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.989 186180 DEBUG oslo_concurrency.lockutils [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "158c51e6-71fc-497d-9677-0db04ae83881-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.990 186180 DEBUG oslo_concurrency.lockutils [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "158c51e6-71fc-497d-9677-0db04ae83881-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.993 186180 INFO nova.compute.manager [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Terminating instance
Feb 16 17:26:29 compute-0 nova_compute[186176]: 2026-02-16 17:26:29.994 186180 DEBUG nova.compute.manager [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:26:30 compute-0 kernel: tap41b5d6ce-d6 (unregistering): left promiscuous mode
Feb 16 17:26:30 compute-0 NetworkManager[56463]: <info>  [1771262790.0189] device (tap41b5d6ce-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:26:30 compute-0 ovn_controller[96437]: 2026-02-16T17:26:30Z|00063|binding|INFO|Releasing lport 41b5d6ce-d60c-4a88-8387-dca85adb1373 from this chassis (sb_readonly=0)
Feb 16 17:26:30 compute-0 ovn_controller[96437]: 2026-02-16T17:26:30Z|00064|binding|INFO|Setting lport 41b5d6ce-d60c-4a88-8387-dca85adb1373 down in Southbound
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.021 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:30 compute-0 ovn_controller[96437]: 2026-02-16T17:26:30Z|00065|binding|INFO|Removing iface tap41b5d6ce-d6 ovn-installed in OVS
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.023 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.031 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.031 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:13:78 10.100.0.12'], port_security=['fa:16:3e:f2:13:78 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '158c51e6-71fc-497d-9677-0db04ae83881', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '97a4c97daa7a495f91b4f65a132f7c0f', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'a09b650a-b8da-4ec6-af84-f46bd29af7dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18cf134a-5b0b-4046-bb3d-fdfa0b081c31, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=41b5d6ce-d60c-4a88-8387-dca85adb1373) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.036 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 41b5d6ce-d60c-4a88-8387-dca85adb1373 in datapath 50b90e9d-0874-4370-ad17-1fff2c4cce15 unbound from our chassis
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.039 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 50b90e9d-0874-4370-ad17-1fff2c4cce15, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.040 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[c2c4485f-c0e4-4ef4-b66a-64fade381e68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.041 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15 namespace which is not needed anymore
Feb 16 17:26:30 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Deactivated successfully.
Feb 16 17:26:30 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Consumed 2.713s CPU time.
Feb 16 17:26:30 compute-0 systemd-machined[155631]: Machine qemu-4-instance-00000003 terminated.
Feb 16 17:26:30 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[207417]: [NOTICE]   (207421) : haproxy version is 2.8.14-c23fe91
Feb 16 17:26:30 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[207417]: [NOTICE]   (207421) : path to executable is /usr/sbin/haproxy
Feb 16 17:26:30 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[207417]: [WARNING]  (207421) : Exiting Master process...
Feb 16 17:26:30 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[207417]: [WARNING]  (207421) : Exiting Master process...
Feb 16 17:26:30 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[207417]: [ALERT]    (207421) : Current worker (207423) exited with code 143 (Terminated)
Feb 16 17:26:30 compute-0 neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15[207417]: [WARNING]  (207421) : All workers exited. Exiting... (0)
Feb 16 17:26:30 compute-0 systemd[1]: libpod-d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313.scope: Deactivated successfully.
Feb 16 17:26:30 compute-0 conmon[207417]: conmon d7ebfa498eb5528536fd <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313.scope/container/memory.events
Feb 16 17:26:30 compute-0 podman[208081]: 2026-02-16 17:26:30.176789009 +0000 UTC m=+0.051634559 container died d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 16 17:26:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313-userdata-shm.mount: Deactivated successfully.
Feb 16 17:26:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-1e53cb9178fa9571351b9fc56bc15852c57a57b23810b531f4258b50d611da39-merged.mount: Deactivated successfully.
Feb 16 17:26:30 compute-0 NetworkManager[56463]: <info>  [1771262790.2180] manager: (tap41b5d6ce-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Feb 16 17:26:30 compute-0 podman[208081]: 2026-02-16 17:26:30.223462616 +0000 UTC m=+0.098308186 container cleanup d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 17:26:30 compute-0 systemd[1]: libpod-conmon-d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313.scope: Deactivated successfully.
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.259 186180 INFO nova.virt.libvirt.driver [-] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Instance destroyed successfully.
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.259 186180 DEBUG nova.objects.instance [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lazy-loading 'resources' on Instance uuid 158c51e6-71fc-497d-9677-0db04ae83881 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.283 186180 DEBUG nova.virt.libvirt.vif [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T17:24:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1670663905',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1670663905',id=3,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:25:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='97a4c97daa7a495f91b4f65a132f7c0f',ramdisk_id='',reservation_id='r-q3y1lfby',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-900316824',owner_user_name='tempest-TestExecuteActionsViaActuator-900316824-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:26:16Z,user_data=None,user_id='04e81d9e145a466bbabfe4fdaf9f09aa',uuid=158c51e6-71fc-497d-9677-0db04ae83881,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "address": "fa:16:3e:f2:13:78", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41b5d6ce-d6", "ovs_interfaceid": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.283 186180 DEBUG nova.network.os_vif_util [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converting VIF {"id": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "address": "fa:16:3e:f2:13:78", "network": {"id": "50b90e9d-0874-4370-ad17-1fff2c4cce15", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-878924146-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "97a4c97daa7a495f91b4f65a132f7c0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41b5d6ce-d6", "ovs_interfaceid": "41b5d6ce-d60c-4a88-8387-dca85adb1373", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.285 186180 DEBUG nova.network.os_vif_util [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:13:78,bridge_name='br-int',has_traffic_filtering=True,id=41b5d6ce-d60c-4a88-8387-dca85adb1373,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41b5d6ce-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.285 186180 DEBUG os_vif [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:13:78,bridge_name='br-int',has_traffic_filtering=True,id=41b5d6ce-d60c-4a88-8387-dca85adb1373,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41b5d6ce-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.286 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.286 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41b5d6ce-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.289 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.290 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.293 186180 INFO os_vif [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:13:78,bridge_name='br-int',has_traffic_filtering=True,id=41b5d6ce-d60c-4a88-8387-dca85adb1373,network=Network(50b90e9d-0874-4370-ad17-1fff2c4cce15),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41b5d6ce-d6')
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.293 186180 INFO nova.virt.libvirt.driver [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Deleting instance files /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881_del
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.294 186180 INFO nova.virt.libvirt.driver [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Deletion of /var/lib/nova/instances/158c51e6-71fc-497d-9677-0db04ae83881_del complete
Feb 16 17:26:30 compute-0 podman[208125]: 2026-02-16 17:26:30.321008493 +0000 UTC m=+0.066200094 container remove d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.325 186180 DEBUG nova.compute.manager [req-984b322f-7c2e-4bce-b269-870b5a1e28e5 req-c51b5a0e-890d-43c5-9c28-11423aab2b56 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Received event network-vif-unplugged-41b5d6ce-d60c-4a88-8387-dca85adb1373 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.326 186180 DEBUG oslo_concurrency.lockutils [req-984b322f-7c2e-4bce-b269-870b5a1e28e5 req-c51b5a0e-890d-43c5-9c28-11423aab2b56 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "158c51e6-71fc-497d-9677-0db04ae83881-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.326 186180 DEBUG oslo_concurrency.lockutils [req-984b322f-7c2e-4bce-b269-870b5a1e28e5 req-c51b5a0e-890d-43c5-9c28-11423aab2b56 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "158c51e6-71fc-497d-9677-0db04ae83881-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.326 186180 DEBUG oslo_concurrency.lockutils [req-984b322f-7c2e-4bce-b269-870b5a1e28e5 req-c51b5a0e-890d-43c5-9c28-11423aab2b56 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "158c51e6-71fc-497d-9677-0db04ae83881-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.326 186180 DEBUG nova.compute.manager [req-984b322f-7c2e-4bce-b269-870b5a1e28e5 req-c51b5a0e-890d-43c5-9c28-11423aab2b56 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] No waiting events found dispatching network-vif-unplugged-41b5d6ce-d60c-4a88-8387-dca85adb1373 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.327 186180 DEBUG nova.compute.manager [req-984b322f-7c2e-4bce-b269-870b5a1e28e5 req-c51b5a0e-890d-43c5-9c28-11423aab2b56 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Received event network-vif-unplugged-41b5d6ce-d60c-4a88-8387-dca85adb1373 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.326 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f8260a98-62b1-46ca-a804-270a6da5cb6e]: (4, ('Mon Feb 16 05:26:30 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15 (d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313)\nd7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313\nMon Feb 16 05:26:30 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15 (d7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313)\nd7ebfa498eb5528536fd0b046cd8a078aebee954e8da15a0aede8f67fcd89313\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.329 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[10198d43-d88c-4bac-8af2-98c71b5ec18c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.330 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50b90e9d-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.332 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:30 compute-0 kernel: tap50b90e9d-00: left promiscuous mode
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.334 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.336 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.336 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.338 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd60d15-1e88-4ce1-a198-916500b5155a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.339 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.352 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[498f8173-48ea-4e58-82f9-8219ed8de036]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.354 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[952b7998-0ab4-432c-9900-283d740d56a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.357 186180 INFO nova.compute.manager [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Took 0.36 seconds to destroy the instance on the hypervisor.
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.358 186180 DEBUG oslo.service.loopingcall [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.358 186180 DEBUG nova.compute.manager [-] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.358 186180 DEBUG nova.network.neutron [-] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:26:30 compute-0 nova_compute[186176]: 2026-02-16 17:26:30.364 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.367 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[6d97772f-1ca9-4c02-a7eb-9787b5207dfd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431258, 'reachable_time': 28493, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208144, 'error': None, 'target': 'ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.370 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-50b90e9d-0874-4370-ad17-1fff2c4cce15 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:26:30 compute-0 systemd[1]: run-netns-ovnmeta\x2d50b90e9d\x2d0874\x2d4370\x2dad17\x2d1fff2c4cce15.mount: Deactivated successfully.
Feb 16 17:26:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:30.370 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[88badb08-d9ff-4676-80f2-f7a6cf031ccc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.039 186180 DEBUG nova.network.neutron [-] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.062 186180 INFO nova.compute.manager [-] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Took 0.70 seconds to deallocate network for instance.
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.141 186180 DEBUG oslo_concurrency.lockutils [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.141 186180 DEBUG oslo_concurrency.lockutils [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.146 186180 DEBUG oslo_concurrency.lockutils [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.281 186180 INFO nova.scheduler.client.report [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Deleted allocations for instance 158c51e6-71fc-497d-9677-0db04ae83881
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.316 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.371 186180 DEBUG oslo_concurrency.lockutils [None req-0e385d98-6498-4d9a-946c-4133066a2d9a 04e81d9e145a466bbabfe4fdaf9f09aa 97a4c97daa7a495f91b4f65a132f7c0f - - default default] Lock "158c51e6-71fc-497d-9677-0db04ae83881" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.405 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:31 compute-0 openstack_network_exporter[198360]: ERROR   17:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:26:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:26:31 compute-0 openstack_network_exporter[198360]: ERROR   17:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:26:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.604 186180 DEBUG nova.compute.manager [req-e35e4a8d-1954-4556-9987-eb26abf35fae req-61bf42cf-4fa2-43ab-b050-ed083b33f223 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Received event network-vif-deleted-41b5d6ce-d60c-4a88-8387-dca85adb1373 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.894 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-158c51e6-71fc-497d-9677-0db04ae83881" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.895 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-158c51e6-71fc-497d-9677-0db04ae83881" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.895 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.895 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 158c51e6-71fc-497d-9677-0db04ae83881 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:26:31 compute-0 nova_compute[186176]: 2026-02-16 17:26:31.930 186180 DEBUG nova.compute.utils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Can not refresh info_cache because instance was not found refresh_info_cache_for_instance /usr/lib/python3.9/site-packages/nova/compute/utils.py:1010
Feb 16 17:26:32 compute-0 nova_compute[186176]: 2026-02-16 17:26:32.140 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:26:32 compute-0 nova_compute[186176]: 2026-02-16 17:26:32.401 186180 DEBUG nova.compute.manager [req-35a174e8-5dc5-4d0a-a4de-db54f46be824 req-72f75f03-a810-456a-8617-8ba3769132ca 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Received event network-vif-plugged-41b5d6ce-d60c-4a88-8387-dca85adb1373 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:26:32 compute-0 nova_compute[186176]: 2026-02-16 17:26:32.402 186180 DEBUG oslo_concurrency.lockutils [req-35a174e8-5dc5-4d0a-a4de-db54f46be824 req-72f75f03-a810-456a-8617-8ba3769132ca 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "158c51e6-71fc-497d-9677-0db04ae83881-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:32 compute-0 nova_compute[186176]: 2026-02-16 17:26:32.402 186180 DEBUG oslo_concurrency.lockutils [req-35a174e8-5dc5-4d0a-a4de-db54f46be824 req-72f75f03-a810-456a-8617-8ba3769132ca 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "158c51e6-71fc-497d-9677-0db04ae83881-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:32 compute-0 nova_compute[186176]: 2026-02-16 17:26:32.402 186180 DEBUG oslo_concurrency.lockutils [req-35a174e8-5dc5-4d0a-a4de-db54f46be824 req-72f75f03-a810-456a-8617-8ba3769132ca 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "158c51e6-71fc-497d-9677-0db04ae83881-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:32 compute-0 nova_compute[186176]: 2026-02-16 17:26:32.402 186180 DEBUG nova.compute.manager [req-35a174e8-5dc5-4d0a-a4de-db54f46be824 req-72f75f03-a810-456a-8617-8ba3769132ca 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] No waiting events found dispatching network-vif-plugged-41b5d6ce-d60c-4a88-8387-dca85adb1373 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:26:32 compute-0 nova_compute[186176]: 2026-02-16 17:26:32.403 186180 WARNING nova.compute.manager [req-35a174e8-5dc5-4d0a-a4de-db54f46be824 req-72f75f03-a810-456a-8617-8ba3769132ca 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Received unexpected event network-vif-plugged-41b5d6ce-d60c-4a88-8387-dca85adb1373 for instance with vm_state deleted and task_state None.
Feb 16 17:26:32 compute-0 nova_compute[186176]: 2026-02-16 17:26:32.494 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:26:32 compute-0 nova_compute[186176]: 2026-02-16 17:26:32.517 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-158c51e6-71fc-497d-9677-0db04ae83881" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:26:32 compute-0 nova_compute[186176]: 2026-02-16 17:26:32.518 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:26:32 compute-0 nova_compute[186176]: 2026-02-16 17:26:32.519 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:33 compute-0 nova_compute[186176]: 2026-02-16 17:26:33.329 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:33 compute-0 nova_compute[186176]: 2026-02-16 17:26:33.329 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:33 compute-0 nova_compute[186176]: 2026-02-16 17:26:33.329 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:26:34 compute-0 nova_compute[186176]: 2026-02-16 17:26:34.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:35 compute-0 podman[208146]: 2026-02-16 17:26:35.072860169 +0000 UTC m=+0.047338824 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:26:35 compute-0 nova_compute[186176]: 2026-02-16 17:26:35.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:35 compute-0 nova_compute[186176]: 2026-02-16 17:26:35.375 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:35 compute-0 podman[208145]: 2026-02-16 17:26:35.421987867 +0000 UTC m=+0.397146329 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.342 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.342 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.342 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.342 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.406 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.473 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.474 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5802MB free_disk=73.22799682617188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.474 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.475 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.524 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.524 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.671 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.695 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.725 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:26:36 compute-0 nova_compute[186176]: 2026-02-16 17:26:36.726 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.252s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:37 compute-0 nova_compute[186176]: 2026-02-16 17:26:37.723 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:37 compute-0 nova_compute[186176]: 2026-02-16 17:26:37.724 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:38.154 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:26:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:38.154 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:26:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:38.154 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:26:38 compute-0 nova_compute[186176]: 2026-02-16 17:26:38.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:38 compute-0 nova_compute[186176]: 2026-02-16 17:26:38.933 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771262783.932639, 7a804a24-fd5e-4882-be31-38bfdfa8c2c3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:26:38 compute-0 nova_compute[186176]: 2026-02-16 17:26:38.934 186180 INFO nova.compute.manager [-] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] VM Stopped (Lifecycle Event)
Feb 16 17:26:38 compute-0 nova_compute[186176]: 2026-02-16 17:26:38.958 186180 DEBUG nova.compute.manager [None req-dbd92a2e-1de0-4708-8aab-c0fae3b00336 - - - - - -] [instance: 7a804a24-fd5e-4882-be31-38bfdfa8c2c3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:26:40 compute-0 nova_compute[186176]: 2026-02-16 17:26:40.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:40 compute-0 nova_compute[186176]: 2026-02-16 17:26:40.379 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:41 compute-0 nova_compute[186176]: 2026-02-16 17:26:41.155 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771262786.1536748, 0d215b2a-91a9-4d0b-a04e-1355b877179d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:26:41 compute-0 nova_compute[186176]: 2026-02-16 17:26:41.156 186180 INFO nova.compute.manager [-] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] VM Stopped (Lifecycle Event)
Feb 16 17:26:41 compute-0 nova_compute[186176]: 2026-02-16 17:26:41.174 186180 DEBUG nova.compute.manager [None req-d246e1f0-a4b0-45fd-887d-c8408a103b94 - - - - - -] [instance: 0d215b2a-91a9-4d0b-a04e-1355b877179d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:26:41 compute-0 nova_compute[186176]: 2026-02-16 17:26:41.409 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:43 compute-0 nova_compute[186176]: 2026-02-16 17:26:43.176 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:26:43 compute-0 nova_compute[186176]: 2026-02-16 17:26:43.262 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771262788.260756, 49698b66-fe7c-4448-88b5-13f0281298da => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:26:43 compute-0 nova_compute[186176]: 2026-02-16 17:26:43.262 186180 INFO nova.compute.manager [-] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] VM Stopped (Lifecycle Event)
Feb 16 17:26:43 compute-0 nova_compute[186176]: 2026-02-16 17:26:43.284 186180 DEBUG nova.compute.manager [None req-7b37f021-7c61-47f7-b9b7-1196005148d7 - - - - - -] [instance: 49698b66-fe7c-4448-88b5-13f0281298da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:26:45 compute-0 nova_compute[186176]: 2026-02-16 17:26:45.257 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771262790.2556446, 158c51e6-71fc-497d-9677-0db04ae83881 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:26:45 compute-0 nova_compute[186176]: 2026-02-16 17:26:45.259 186180 INFO nova.compute.manager [-] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] VM Stopped (Lifecycle Event)
Feb 16 17:26:45 compute-0 nova_compute[186176]: 2026-02-16 17:26:45.280 186180 DEBUG nova.compute.manager [None req-0d01d6d9-2567-46ae-9fde-040b32f7bd6f - - - - - -] [instance: 158c51e6-71fc-497d-9677-0db04ae83881] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:26:45 compute-0 nova_compute[186176]: 2026-02-16 17:26:45.382 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:46 compute-0 nova_compute[186176]: 2026-02-16 17:26:46.253 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:46 compute-0 nova_compute[186176]: 2026-02-16 17:26:46.411 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:50 compute-0 nova_compute[186176]: 2026-02-16 17:26:50.385 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:50 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:50.515 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:26:50 compute-0 nova_compute[186176]: 2026-02-16 17:26:50.515 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:50 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:50.517 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:26:51 compute-0 nova_compute[186176]: 2026-02-16 17:26:51.413 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:54 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:26:54.520 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:26:55 compute-0 nova_compute[186176]: 2026-02-16 17:26:55.389 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:56 compute-0 podman[208194]: 2026-02-16 17:26:56.112672667 +0000 UTC m=+0.081387224 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.component=ubi9-minimal-container, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git)
Feb 16 17:26:56 compute-0 nova_compute[186176]: 2026-02-16 17:26:56.417 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:26:59 compute-0 podman[208218]: 2026-02-16 17:26:59.077649184 +0000 UTC m=+0.050380509 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 17:26:59 compute-0 podman[195505]: time="2026-02-16T17:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:26:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:26:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2173 "" "Go-http-client/1.1"
Feb 16 17:27:00 compute-0 nova_compute[186176]: 2026-02-16 17:27:00.391 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:01 compute-0 openstack_network_exporter[198360]: ERROR   17:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:27:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:27:01 compute-0 nova_compute[186176]: 2026-02-16 17:27:01.419 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:01 compute-0 openstack_network_exporter[198360]: ERROR   17:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:27:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:27:05 compute-0 nova_compute[186176]: 2026-02-16 17:27:05.394 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:06 compute-0 podman[208240]: 2026-02-16 17:27:06.119303533 +0000 UTC m=+0.084659788 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:27:06 compute-0 podman[208239]: 2026-02-16 17:27:06.119259602 +0000 UTC m=+0.089707973 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 17:27:06 compute-0 nova_compute[186176]: 2026-02-16 17:27:06.423 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:10 compute-0 nova_compute[186176]: 2026-02-16 17:27:10.398 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:11 compute-0 nova_compute[186176]: 2026-02-16 17:27:11.424 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:15 compute-0 nova_compute[186176]: 2026-02-16 17:27:15.402 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:16 compute-0 nova_compute[186176]: 2026-02-16 17:27:16.426 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:20 compute-0 nova_compute[186176]: 2026-02-16 17:27:20.405 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:21 compute-0 nova_compute[186176]: 2026-02-16 17:27:21.428 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:21 compute-0 ovn_controller[96437]: 2026-02-16T17:27:21Z|00066|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 16 17:27:22 compute-0 nova_compute[186176]: 2026-02-16 17:27:22.613 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:27:22 compute-0 nova_compute[186176]: 2026-02-16 17:27:22.613 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:27:22 compute-0 nova_compute[186176]: 2026-02-16 17:27:22.631 186180 DEBUG nova.compute.manager [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:27:22 compute-0 nova_compute[186176]: 2026-02-16 17:27:22.932 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:27:22 compute-0 nova_compute[186176]: 2026-02-16 17:27:22.933 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:27:22 compute-0 nova_compute[186176]: 2026-02-16 17:27:22.942 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:27:22 compute-0 nova_compute[186176]: 2026-02-16 17:27:22.942 186180 INFO nova.compute.claims [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.069 186180 DEBUG nova.compute.provider_tree [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.083 186180 DEBUG nova.scheduler.client.report [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.104 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.105 186180 DEBUG nova.compute.manager [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.149 186180 DEBUG nova.compute.manager [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.150 186180 DEBUG nova.network.neutron [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.168 186180 INFO nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.184 186180 DEBUG nova.compute.manager [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.293 186180 DEBUG nova.compute.manager [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.294 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.295 186180 INFO nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Creating image(s)
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.295 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "/var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.296 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "/var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.297 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "/var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.311 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.378 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.380 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.381 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.412 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.495 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.497 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.530 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.532 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.533 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.580 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.582 186180 DEBUG nova.virt.disk.api [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Checking if we can resize image /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.582 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.629 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.630 186180 DEBUG nova.virt.disk.api [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Cannot resize image /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.631 186180 DEBUG nova.objects.instance [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lazy-loading 'migration_context' on Instance uuid 96178328-ed2e-49fe-b48b-c9cc5e9d509c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.651 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.651 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Ensure instance console log exists: /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.652 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.653 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:27:23 compute-0 nova_compute[186176]: 2026-02-16 17:27:23.653 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:27:24 compute-0 nova_compute[186176]: 2026-02-16 17:27:24.189 186180 DEBUG nova.policy [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e780e424f9f94938bff44671975be4ba', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '70ed1dbc47324f8890fd0ec8599a8f86', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:27:25 compute-0 nova_compute[186176]: 2026-02-16 17:27:25.407 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:25 compute-0 nova_compute[186176]: 2026-02-16 17:27:25.628 186180 DEBUG nova.network.neutron [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Successfully created port: 5879f7be-34ad-4c93-92ab-de77c246915c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:27:26 compute-0 nova_compute[186176]: 2026-02-16 17:27:26.323 186180 DEBUG nova.network.neutron [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Successfully updated port: 5879f7be-34ad-4c93-92ab-de77c246915c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:27:26 compute-0 nova_compute[186176]: 2026-02-16 17:27:26.338 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:27:26 compute-0 nova_compute[186176]: 2026-02-16 17:27:26.339 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquired lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:27:26 compute-0 nova_compute[186176]: 2026-02-16 17:27:26.339 186180 DEBUG nova.network.neutron [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:27:26 compute-0 nova_compute[186176]: 2026-02-16 17:27:26.410 186180 DEBUG nova.compute.manager [req-37bebcd1-2717-414b-b0b8-549d42418d24 req-c207c42a-5735-417f-9cf5-61efb42be50c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Received event network-changed-5879f7be-34ad-4c93-92ab-de77c246915c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:27:26 compute-0 nova_compute[186176]: 2026-02-16 17:27:26.410 186180 DEBUG nova.compute.manager [req-37bebcd1-2717-414b-b0b8-549d42418d24 req-c207c42a-5735-417f-9cf5-61efb42be50c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Refreshing instance network info cache due to event network-changed-5879f7be-34ad-4c93-92ab-de77c246915c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:27:26 compute-0 nova_compute[186176]: 2026-02-16 17:27:26.411 186180 DEBUG oslo_concurrency.lockutils [req-37bebcd1-2717-414b-b0b8-549d42418d24 req-c207c42a-5735-417f-9cf5-61efb42be50c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:27:26 compute-0 nova_compute[186176]: 2026-02-16 17:27:26.430 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:27 compute-0 podman[208305]: 2026-02-16 17:27:27.110575563 +0000 UTC m=+0.079912111 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., version=9.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, container_name=openstack_network_exporter)
Feb 16 17:27:27 compute-0 nova_compute[186176]: 2026-02-16 17:27:27.315 186180 DEBUG nova.network.neutron [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.020 186180 DEBUG nova.network.neutron [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Updating instance_info_cache with network_info: [{"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.042 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Releasing lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.043 186180 DEBUG nova.compute.manager [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Instance network_info: |[{"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.044 186180 DEBUG oslo_concurrency.lockutils [req-37bebcd1-2717-414b-b0b8-549d42418d24 req-c207c42a-5735-417f-9cf5-61efb42be50c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.044 186180 DEBUG nova.network.neutron [req-37bebcd1-2717-414b-b0b8-549d42418d24 req-c207c42a-5735-417f-9cf5-61efb42be50c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Refreshing network info cache for port 5879f7be-34ad-4c93-92ab-de77c246915c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.049 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Start _get_guest_xml network_info=[{"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.055 186180 WARNING nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.061 186180 DEBUG nova.virt.libvirt.host [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.062 186180 DEBUG nova.virt.libvirt.host [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.071 186180 DEBUG nova.virt.libvirt.host [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.072 186180 DEBUG nova.virt.libvirt.host [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.073 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.074 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.075 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.075 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.076 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.076 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.076 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.077 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.077 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.078 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.078 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.078 186180 DEBUG nova.virt.hardware [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.085 186180 DEBUG nova.virt.libvirt.vif [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:27:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-256910617',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-256910617',id=7,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70ed1dbc47324f8890fd0ec8599a8f86',ramdisk_id='',reservation_id='r-wks0qpvt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1263148546',owner_user_name='tempest-TestExecuteBasicStrategy-1263148546-project-member'},tags=Tag
List,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:27:23Z,user_data=None,user_id='e780e424f9f94938bff44671975be4ba',uuid=96178328-ed2e-49fe-b48b-c9cc5e9d509c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.086 186180 DEBUG nova.network.os_vif_util [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Converting VIF {"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.087 186180 DEBUG nova.network.os_vif_util [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e9:4d,bridge_name='br-int',has_traffic_filtering=True,id=5879f7be-34ad-4c93-92ab-de77c246915c,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5879f7be-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.088 186180 DEBUG nova.objects.instance [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lazy-loading 'pci_devices' on Instance uuid 96178328-ed2e-49fe-b48b-c9cc5e9d509c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.102 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:27:29 compute-0 nova_compute[186176]:   <uuid>96178328-ed2e-49fe-b48b-c9cc5e9d509c</uuid>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   <name>instance-00000007</name>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteBasicStrategy-server-256910617</nova:name>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:27:29</nova:creationTime>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:27:29 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:27:29 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:27:29 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:27:29 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:27:29 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:27:29 compute-0 nova_compute[186176]:         <nova:user uuid="e780e424f9f94938bff44671975be4ba">tempest-TestExecuteBasicStrategy-1263148546-project-member</nova:user>
Feb 16 17:27:29 compute-0 nova_compute[186176]:         <nova:project uuid="70ed1dbc47324f8890fd0ec8599a8f86">tempest-TestExecuteBasicStrategy-1263148546</nova:project>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:27:29 compute-0 nova_compute[186176]:         <nova:port uuid="5879f7be-34ad-4c93-92ab-de77c246915c">
Feb 16 17:27:29 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <system>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <entry name="serial">96178328-ed2e-49fe-b48b-c9cc5e9d509c</entry>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <entry name="uuid">96178328-ed2e-49fe-b48b-c9cc5e9d509c</entry>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     </system>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   <os>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   </os>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   <features>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   </features>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk.config"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:1e:e9:4d"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <target dev="tap5879f7be-34"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/console.log" append="off"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <video>
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     </video>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:27:29 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:27:29 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:27:29 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:27:29 compute-0 nova_compute[186176]: </domain>
Feb 16 17:27:29 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.104 186180 DEBUG nova.compute.manager [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Preparing to wait for external event network-vif-plugged-5879f7be-34ad-4c93-92ab-de77c246915c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.105 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.106 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.106 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.108 186180 DEBUG nova.virt.libvirt.vif [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:27:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-256910617',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-256910617',id=7,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='70ed1dbc47324f8890fd0ec8599a8f86',ramdisk_id='',reservation_id='r-wks0qpvt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-1263148546',owner_user_name='tempest-TestExecuteBasicStrategy-1263148546-project-member'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:27:23Z,user_data=None,user_id='e780e424f9f94938bff44671975be4ba',uuid=96178328-ed2e-49fe-b48b-c9cc5e9d509c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.109 186180 DEBUG nova.network.os_vif_util [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Converting VIF {"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.110 186180 DEBUG nova.network.os_vif_util [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e9:4d,bridge_name='br-int',has_traffic_filtering=True,id=5879f7be-34ad-4c93-92ab-de77c246915c,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5879f7be-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.111 186180 DEBUG os_vif [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e9:4d,bridge_name='br-int',has_traffic_filtering=True,id=5879f7be-34ad-4c93-92ab-de77c246915c,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5879f7be-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.112 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.113 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.114 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.118 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.118 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5879f7be-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.119 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5879f7be-34, col_values=(('external_ids', {'iface-id': '5879f7be-34ad-4c93-92ab-de77c246915c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:e9:4d', 'vm-uuid': '96178328-ed2e-49fe-b48b-c9cc5e9d509c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.121 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:29 compute-0 NetworkManager[56463]: <info>  [1771262849.1224] manager: (tap5879f7be-34): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.124 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.129 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.131 186180 INFO os_vif [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:e9:4d,bridge_name='br-int',has_traffic_filtering=True,id=5879f7be-34ad-4c93-92ab-de77c246915c,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5879f7be-34')
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.182 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.183 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.183 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] No VIF found with MAC fa:16:3e:1e:e9:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.183 186180 INFO nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Using config drive
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.513 186180 INFO nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Creating config drive at /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk.config
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.520 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzjkwe2vo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.642 186180 DEBUG oslo_concurrency.processutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpzjkwe2vo" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:27:29 compute-0 kernel: tap5879f7be-34: entered promiscuous mode
Feb 16 17:27:29 compute-0 NetworkManager[56463]: <info>  [1771262849.7183] manager: (tap5879f7be-34): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Feb 16 17:27:29 compute-0 ovn_controller[96437]: 2026-02-16T17:27:29Z|00067|binding|INFO|Claiming lport 5879f7be-34ad-4c93-92ab-de77c246915c for this chassis.
Feb 16 17:27:29 compute-0 ovn_controller[96437]: 2026-02-16T17:27:29Z|00068|binding|INFO|5879f7be-34ad-4c93-92ab-de77c246915c: Claiming fa:16:3e:1e:e9:4d 10.100.0.4
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.762 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:29 compute-0 podman[195505]: time="2026-02-16T17:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:27:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:27:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2169 "" "Go-http-client/1.1"
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.780 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:e9:4d 10.100.0.4'], port_security=['fa:16:3e:1e:e9:4d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '96178328-ed2e-49fe-b48b-c9cc5e9d509c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70ed1dbc47324f8890fd0ec8599a8f86', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0a5399e9-f3d7-4e52-9a46-0a6482b7de76', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3aa309be-92d4-466f-ba5d-4012e3d564ca, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=5879f7be-34ad-4c93-92ab-de77c246915c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.782 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 5879f7be-34ad-4c93-92ab-de77c246915c in datapath c1062e90-d609-4ef6-8ee4-d67d0aa8101c bound to our chassis
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.783 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1062e90-d609-4ef6-8ee4-d67d0aa8101c
Feb 16 17:27:29 compute-0 systemd-udevd[208354]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.790 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:29 compute-0 ovn_controller[96437]: 2026-02-16T17:27:29Z|00069|binding|INFO|Setting lport 5879f7be-34ad-4c93-92ab-de77c246915c ovn-installed in OVS
Feb 16 17:27:29 compute-0 ovn_controller[96437]: 2026-02-16T17:27:29Z|00070|binding|INFO|Setting lport 5879f7be-34ad-4c93-92ab-de77c246915c up in Southbound
Feb 16 17:27:29 compute-0 nova_compute[186176]: 2026-02-16 17:27:29.794 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.793 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3c5f9f55-49e4-4bd9-a1ac-506c69ad0d5d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.796 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc1062e90-d1 in ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.800 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc1062e90-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.800 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[ede75bc7-8e8e-4dcb-b844-a061712a4e87]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 systemd-machined[155631]: New machine qemu-6-instance-00000007.
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.801 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[545f8029-b2b8-470c-aa9d-ca0d990c7211]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000007.
Feb 16 17:27:29 compute-0 NetworkManager[56463]: <info>  [1771262849.8105] device (tap5879f7be-34): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:27:29 compute-0 NetworkManager[56463]: <info>  [1771262849.8119] device (tap5879f7be-34): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.813 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[66623103-006c-4982-b7f9-0228f0734652]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.823 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[76ce9d43-e475-4e09-a4cb-2d07c986da06]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.848 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[f7bac8dc-2b14-4664-9b34-e5f33b2a8fae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 podman[208339]: 2026-02-16 17:27:29.852979534 +0000 UTC m=+0.135457980 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 17:27:29 compute-0 NetworkManager[56463]: <info>  [1771262849.8556] manager: (tapc1062e90-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.854 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[a83bcb9e-4c18-4f9c-8554-7fec6a94dc5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.889 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[9bf17af7-99d4-4275-a590-b16fdb204996]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.895 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[eeb1ae8e-4c77-4a80-a6ba-d919e3de9d86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 NetworkManager[56463]: <info>  [1771262849.9173] device (tapc1062e90-d0): carrier: link connected
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.924 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[451abe0c-4ecb-433d-84db-91ae4a521279]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.946 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f3f64f-c1b9-4144-b81d-90711f5384fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1062e90-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:31:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444544, 'reachable_time': 43617, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208399, 'error': None, 'target': 'ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.963 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[050bf388-8f43-4bc1-b2d5-0e6293f8f517]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecf:3127'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444544, 'tstamp': 444544}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208401, 'error': None, 'target': 'ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:29.976 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[53df53ae-5f7a-4bd4-aba6-78e7593d3fc5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1062e90-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:31:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444544, 'reachable_time': 43617, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208402, 'error': None, 'target': 'ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:30.003 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f2212d-b4d6-4d5f-aa14-6f839cb26bfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:30.049 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[850126f2-5325-41ce-9399-de9542c1d210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:30.050 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1062e90-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:30.050 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:30.051 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1062e90-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.053 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:30 compute-0 NetworkManager[56463]: <info>  [1771262850.0539] manager: (tapc1062e90-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Feb 16 17:27:30 compute-0 kernel: tapc1062e90-d0: entered promiscuous mode
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.059 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:30.060 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1062e90-d0, col_values=(('external_ids', {'iface-id': '7a5b24ec-e035-4a6a-afde-d2c30fbdd72e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.062 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:30 compute-0 ovn_controller[96437]: 2026-02-16T17:27:30Z|00071|binding|INFO|Releasing lport 7a5b24ec-e035-4a6a-afde-d2c30fbdd72e from this chassis (sb_readonly=0)
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.069 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.070 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:30.070 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c1062e90-d609-4ef6-8ee4-d67d0aa8101c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c1062e90-d609-4ef6-8ee4-d67d0aa8101c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:30.071 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[368bb699-458b-409f-af7b-e5d21c6f7805]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:30.072 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-c1062e90-d609-4ef6-8ee4-d67d0aa8101c
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/c1062e90-d609-4ef6-8ee4-d67d0aa8101c.pid.haproxy
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID c1062e90-d609-4ef6-8ee4-d67d0aa8101c
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:27:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:30.073 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'env', 'PROCESS_TAG=haproxy-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c1062e90-d609-4ef6-8ee4-d67d0aa8101c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.244 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262850.244389, 96178328-ed2e-49fe-b48b-c9cc5e9d509c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.245 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] VM Started (Lifecycle Event)
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.271 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.277 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262850.24521, 96178328-ed2e-49fe-b48b-c9cc5e9d509c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.278 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] VM Paused (Lifecycle Event)
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.306 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.312 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.332 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.393 186180 DEBUG nova.compute.manager [req-e1720326-8e86-4260-9a69-0bc6dc8c13e9 req-6e9ef9d8-a60b-4b54-bdca-041ed1d7d038 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Received event network-vif-plugged-5879f7be-34ad-4c93-92ab-de77c246915c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.394 186180 DEBUG oslo_concurrency.lockutils [req-e1720326-8e86-4260-9a69-0bc6dc8c13e9 req-6e9ef9d8-a60b-4b54-bdca-041ed1d7d038 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.394 186180 DEBUG oslo_concurrency.lockutils [req-e1720326-8e86-4260-9a69-0bc6dc8c13e9 req-6e9ef9d8-a60b-4b54-bdca-041ed1d7d038 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.394 186180 DEBUG oslo_concurrency.lockutils [req-e1720326-8e86-4260-9a69-0bc6dc8c13e9 req-6e9ef9d8-a60b-4b54-bdca-041ed1d7d038 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.395 186180 DEBUG nova.compute.manager [req-e1720326-8e86-4260-9a69-0bc6dc8c13e9 req-6e9ef9d8-a60b-4b54-bdca-041ed1d7d038 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Processing event network-vif-plugged-5879f7be-34ad-4c93-92ab-de77c246915c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.395 186180 DEBUG nova.compute.manager [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.407 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262850.4022508, 96178328-ed2e-49fe-b48b-c9cc5e9d509c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.408 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] VM Resumed (Lifecycle Event)
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.410 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.416 186180 INFO nova.virt.libvirt.driver [-] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Instance spawned successfully.
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.416 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.431 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.441 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.444 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.445 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.445 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.445 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.446 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.446 186180 DEBUG nova.virt.libvirt.driver [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.476 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:27:30 compute-0 podman[208440]: 2026-02-16 17:27:30.494351284 +0000 UTC m=+0.060193925 container create 3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.508 186180 INFO nova.compute.manager [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Took 7.21 seconds to spawn the instance on the hypervisor.
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.509 186180 DEBUG nova.compute.manager [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:27:30 compute-0 systemd[1]: Started libpod-conmon-3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a.scope.
Feb 16 17:27:30 compute-0 podman[208440]: 2026-02-16 17:27:30.468305962 +0000 UTC m=+0.034148633 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:27:30 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:27:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7a367084e238bd8d9c9b4c1c052b6a55c72be62b22d3cfb7b642e770e81ed6b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:27:30 compute-0 podman[208440]: 2026-02-16 17:27:30.584877846 +0000 UTC m=+0.150720487 container init 3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.591 186180 INFO nova.compute.manager [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Took 7.92 seconds to build instance.
Feb 16 17:27:30 compute-0 podman[208440]: 2026-02-16 17:27:30.594334539 +0000 UTC m=+0.160177160 container start 3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 16 17:27:30 compute-0 nova_compute[186176]: 2026-02-16 17:27:30.613 186180 DEBUG oslo_concurrency.lockutils [None req-80916825-b247-4d17-9016-070dc03b1b37 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:27:30 compute-0 neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c[208456]: [NOTICE]   (208460) : New worker (208462) forked
Feb 16 17:27:30 compute-0 neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c[208456]: [NOTICE]   (208460) : Loading success.
Feb 16 17:27:31 compute-0 nova_compute[186176]: 2026-02-16 17:27:31.338 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:27:31 compute-0 nova_compute[186176]: 2026-02-16 17:27:31.338 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:27:31 compute-0 nova_compute[186176]: 2026-02-16 17:27:31.338 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:27:31 compute-0 nova_compute[186176]: 2026-02-16 17:27:31.386 186180 DEBUG nova.network.neutron [req-37bebcd1-2717-414b-b0b8-549d42418d24 req-c207c42a-5735-417f-9cf5-61efb42be50c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Updated VIF entry in instance network info cache for port 5879f7be-34ad-4c93-92ab-de77c246915c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:27:31 compute-0 nova_compute[186176]: 2026-02-16 17:27:31.387 186180 DEBUG nova.network.neutron [req-37bebcd1-2717-414b-b0b8-549d42418d24 req-c207c42a-5735-417f-9cf5-61efb42be50c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Updating instance_info_cache with network_info: [{"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:27:31 compute-0 nova_compute[186176]: 2026-02-16 17:27:31.404 186180 DEBUG oslo_concurrency.lockutils [req-37bebcd1-2717-414b-b0b8-549d42418d24 req-c207c42a-5735-417f-9cf5-61efb42be50c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:27:31 compute-0 openstack_network_exporter[198360]: ERROR   17:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:27:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:27:31 compute-0 openstack_network_exporter[198360]: ERROR   17:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:27:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:27:31 compute-0 nova_compute[186176]: 2026-02-16 17:27:31.432 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:31 compute-0 nova_compute[186176]: 2026-02-16 17:27:31.469 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:27:31 compute-0 nova_compute[186176]: 2026-02-16 17:27:31.470 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:27:31 compute-0 nova_compute[186176]: 2026-02-16 17:27:31.470 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:27:31 compute-0 nova_compute[186176]: 2026-02-16 17:27:31.471 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 96178328-ed2e-49fe-b48b-c9cc5e9d509c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:27:32 compute-0 nova_compute[186176]: 2026-02-16 17:27:32.478 186180 DEBUG nova.compute.manager [req-84964a8a-1cf0-4eeb-9d2b-44ebedad87b8 req-b7a4e862-416a-4957-a6dd-30bfdeebb6f6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Received event network-vif-plugged-5879f7be-34ad-4c93-92ab-de77c246915c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:27:32 compute-0 nova_compute[186176]: 2026-02-16 17:27:32.478 186180 DEBUG oslo_concurrency.lockutils [req-84964a8a-1cf0-4eeb-9d2b-44ebedad87b8 req-b7a4e862-416a-4957-a6dd-30bfdeebb6f6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:27:32 compute-0 nova_compute[186176]: 2026-02-16 17:27:32.478 186180 DEBUG oslo_concurrency.lockutils [req-84964a8a-1cf0-4eeb-9d2b-44ebedad87b8 req-b7a4e862-416a-4957-a6dd-30bfdeebb6f6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:27:32 compute-0 nova_compute[186176]: 2026-02-16 17:27:32.478 186180 DEBUG oslo_concurrency.lockutils [req-84964a8a-1cf0-4eeb-9d2b-44ebedad87b8 req-b7a4e862-416a-4957-a6dd-30bfdeebb6f6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:27:32 compute-0 nova_compute[186176]: 2026-02-16 17:27:32.478 186180 DEBUG nova.compute.manager [req-84964a8a-1cf0-4eeb-9d2b-44ebedad87b8 req-b7a4e862-416a-4957-a6dd-30bfdeebb6f6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] No waiting events found dispatching network-vif-plugged-5879f7be-34ad-4c93-92ab-de77c246915c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:27:32 compute-0 nova_compute[186176]: 2026-02-16 17:27:32.478 186180 WARNING nova.compute.manager [req-84964a8a-1cf0-4eeb-9d2b-44ebedad87b8 req-b7a4e862-416a-4957-a6dd-30bfdeebb6f6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Received unexpected event network-vif-plugged-5879f7be-34ad-4c93-92ab-de77c246915c for instance with vm_state active and task_state None.
Feb 16 17:27:33 compute-0 nova_compute[186176]: 2026-02-16 17:27:33.186 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Updating instance_info_cache with network_info: [{"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:27:33 compute-0 nova_compute[186176]: 2026-02-16 17:27:33.205 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:27:33 compute-0 nova_compute[186176]: 2026-02-16 17:27:33.206 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:27:33 compute-0 nova_compute[186176]: 2026-02-16 17:27:33.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:27:33 compute-0 nova_compute[186176]: 2026-02-16 17:27:33.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:27:34 compute-0 nova_compute[186176]: 2026-02-16 17:27:34.121 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:35 compute-0 nova_compute[186176]: 2026-02-16 17:27:35.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:27:35 compute-0 nova_compute[186176]: 2026-02-16 17:27:35.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:27:36 compute-0 nova_compute[186176]: 2026-02-16 17:27:36.433 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:37 compute-0 podman[208473]: 2026-02-16 17:27:37.097522045 +0000 UTC m=+0.059484008 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 17:27:37 compute-0 podman[208472]: 2026-02-16 17:27:37.132104267 +0000 UTC m=+0.094999903 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:27:37 compute-0 nova_compute[186176]: 2026-02-16 17:27:37.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:27:37 compute-0 nova_compute[186176]: 2026-02-16 17:27:37.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:27:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:38.156 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:27:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:38.159 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:27:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:27:38.161 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.350 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.350 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.351 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.423 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.494 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.495 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.573 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.756 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.758 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5655MB free_disk=73.2271728515625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.759 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.760 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.846 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance 96178328-ed2e-49fe-b48b-c9cc5e9d509c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.847 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.847 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.881 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.896 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.918 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:27:38 compute-0 nova_compute[186176]: 2026-02-16 17:27:38.918 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:27:39 compute-0 nova_compute[186176]: 2026-02-16 17:27:39.124 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:39 compute-0 nova_compute[186176]: 2026-02-16 17:27:39.923 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:27:41 compute-0 nova_compute[186176]: 2026-02-16 17:27:41.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:27:41 compute-0 nova_compute[186176]: 2026-02-16 17:27:41.436 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:41 compute-0 ovn_controller[96437]: 2026-02-16T17:27:41Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:e9:4d 10.100.0.4
Feb 16 17:27:41 compute-0 ovn_controller[96437]: 2026-02-16T17:27:41Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:e9:4d 10.100.0.4
Feb 16 17:27:44 compute-0 nova_compute[186176]: 2026-02-16 17:27:44.126 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:46 compute-0 nova_compute[186176]: 2026-02-16 17:27:46.438 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:49 compute-0 nova_compute[186176]: 2026-02-16 17:27:49.128 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:51 compute-0 nova_compute[186176]: 2026-02-16 17:27:51.441 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:54 compute-0 nova_compute[186176]: 2026-02-16 17:27:54.130 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:56 compute-0 nova_compute[186176]: 2026-02-16 17:27:56.442 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:58 compute-0 podman[208539]: 2026-02-16 17:27:58.100661229 +0000 UTC m=+0.065977938 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7)
Feb 16 17:27:59 compute-0 nova_compute[186176]: 2026-02-16 17:27:59.132 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:27:59 compute-0 podman[195505]: time="2026-02-16T17:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:27:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:27:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 16 17:28:00 compute-0 podman[208560]: 2026-02-16 17:28:00.132388372 +0000 UTC m=+0.089915917 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 16 17:28:01 compute-0 openstack_network_exporter[198360]: ERROR   17:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:28:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:28:01 compute-0 openstack_network_exporter[198360]: ERROR   17:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:28:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:28:01 compute-0 nova_compute[186176]: 2026-02-16 17:28:01.444 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:04 compute-0 nova_compute[186176]: 2026-02-16 17:28:04.134 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:06 compute-0 nova_compute[186176]: 2026-02-16 17:28:06.447 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:08 compute-0 podman[208580]: 2026-02-16 17:28:08.093952148 +0000 UTC m=+0.058192766 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:28:08 compute-0 podman[208579]: 2026-02-16 17:28:08.122082071 +0000 UTC m=+0.088329108 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Feb 16 17:28:08 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 17:28:09 compute-0 nova_compute[186176]: 2026-02-16 17:28:09.137 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:09 compute-0 ovn_controller[96437]: 2026-02-16T17:28:09Z|00072|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 16 17:28:11 compute-0 nova_compute[186176]: 2026-02-16 17:28:11.449 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:14 compute-0 nova_compute[186176]: 2026-02-16 17:28:14.139 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:16 compute-0 nova_compute[186176]: 2026-02-16 17:28:16.451 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:19 compute-0 nova_compute[186176]: 2026-02-16 17:28:19.141 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:21 compute-0 nova_compute[186176]: 2026-02-16 17:28:21.453 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:24 compute-0 nova_compute[186176]: 2026-02-16 17:28:24.143 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:26 compute-0 nova_compute[186176]: 2026-02-16 17:28:26.456 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:29 compute-0 podman[208629]: 2026-02-16 17:28:29.110835203 +0000 UTC m=+0.069916704 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, managed_by=edpm_ansible)
Feb 16 17:28:29 compute-0 nova_compute[186176]: 2026-02-16 17:28:29.146 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:29 compute-0 podman[195505]: time="2026-02-16T17:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:28:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:28:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2633 "" "Go-http-client/1.1"
Feb 16 17:28:31 compute-0 podman[208652]: 2026-02-16 17:28:31.109634863 +0000 UTC m=+0.070542330 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 17:28:31 compute-0 nova_compute[186176]: 2026-02-16 17:28:31.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:28:31 compute-0 nova_compute[186176]: 2026-02-16 17:28:31.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:28:31 compute-0 nova_compute[186176]: 2026-02-16 17:28:31.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:28:31 compute-0 openstack_network_exporter[198360]: ERROR   17:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:28:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:28:31 compute-0 openstack_network_exporter[198360]: ERROR   17:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:28:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:28:31 compute-0 nova_compute[186176]: 2026-02-16 17:28:31.457 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:31 compute-0 nova_compute[186176]: 2026-02-16 17:28:31.529 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:28:31 compute-0 nova_compute[186176]: 2026-02-16 17:28:31.530 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:28:31 compute-0 nova_compute[186176]: 2026-02-16 17:28:31.530 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:28:31 compute-0 nova_compute[186176]: 2026-02-16 17:28:31.530 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 96178328-ed2e-49fe-b48b-c9cc5e9d509c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:28:33 compute-0 nova_compute[186176]: 2026-02-16 17:28:33.561 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Updating instance_info_cache with network_info: [{"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:28:33 compute-0 nova_compute[186176]: 2026-02-16 17:28:33.579 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-96178328-ed2e-49fe-b48b-c9cc5e9d509c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:28:33 compute-0 nova_compute[186176]: 2026-02-16 17:28:33.580 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:28:33 compute-0 nova_compute[186176]: 2026-02-16 17:28:33.581 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:28:33 compute-0 nova_compute[186176]: 2026-02-16 17:28:33.581 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:28:33 compute-0 nova_compute[186176]: 2026-02-16 17:28:33.953 186180 DEBUG nova.virt.libvirt.driver [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Creating tmpfile /var/lib/nova/instances/tmp0cd9754f to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 17:28:33 compute-0 nova_compute[186176]: 2026-02-16 17:28:33.954 186180 DEBUG nova.compute.manager [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0cd9754f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 17:28:34 compute-0 nova_compute[186176]: 2026-02-16 17:28:34.148 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:34 compute-0 nova_compute[186176]: 2026-02-16 17:28:34.926 186180 DEBUG nova.compute.manager [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0cd9754f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2653d20c-6ae2-4f6d-8d76-50640d70defd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 17:28:34 compute-0 nova_compute[186176]: 2026-02-16 17:28:34.954 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-2653d20c-6ae2-4f6d-8d76-50640d70defd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:28:34 compute-0 nova_compute[186176]: 2026-02-16 17:28:34.955 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-2653d20c-6ae2-4f6d-8d76-50640d70defd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:28:34 compute-0 nova_compute[186176]: 2026-02-16 17:28:34.956 186180 DEBUG nova.network.neutron [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:28:35 compute-0 nova_compute[186176]: 2026-02-16 17:28:35.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.313 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.458 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.699 186180 DEBUG nova.network.neutron [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Updating instance_info_cache with network_info: [{"id": "30b71dbc-27fb-4b14-90f6-296da61fd380", "address": "fa:16:3e:f8:7a:76", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b71dbc-27", "ovs_interfaceid": "30b71dbc-27fb-4b14-90f6-296da61fd380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.720 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-2653d20c-6ae2-4f6d-8d76-50640d70defd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.723 186180 DEBUG nova.virt.libvirt.driver [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0cd9754f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2653d20c-6ae2-4f6d-8d76-50640d70defd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.723 186180 DEBUG nova.virt.libvirt.driver [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Creating instance directory: /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.725 186180 DEBUG nova.virt.libvirt.driver [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Creating disk.info with the contents: {'/var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk': 'qcow2', '/var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.725 186180 DEBUG nova.virt.libvirt.driver [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.726 186180 DEBUG nova.objects.instance [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 2653d20c-6ae2-4f6d-8d76-50640d70defd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.770 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.846 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.848 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.849 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.870 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.927 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.928 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.959 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.960 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:36 compute-0 nova_compute[186176]: 2026-02-16 17:28:36.961 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.008 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.009 186180 DEBUG nova.virt.disk.api [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Checking if we can resize image /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.010 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.081 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.083 186180 DEBUG nova.virt.disk.api [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Cannot resize image /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.083 186180 DEBUG nova.objects.instance [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid 2653d20c-6ae2-4f6d-8d76-50640d70defd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.097 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.114 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk.config 485376" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.116 186180 DEBUG nova.virt.libvirt.volume.remotefs [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk.config to /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.116 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk.config /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.569 186180 DEBUG oslo_concurrency.processutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd/disk.config /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.570 186180 DEBUG nova.virt.libvirt.driver [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.571 186180 DEBUG nova.virt.libvirt.vif [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:27:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-808493043',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-808493043',id=8,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:27:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='70ed1dbc47324f8890fd0ec8599a8f86',ramdisk_id='',reservation_id='r-ykc0caat',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestExecuteBasicStrategy-1263148546',owner_user_name='tempest-TestExecuteBasicStrategy-1263148546-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:27:46Z,user_data=None,user_id='e780e424f9f94938bff44671975be4ba',uuid=2653d20c-6ae2-4f6d-8d76-50640d70defd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "30b71dbc-27fb-4b14-90f6-296da61fd380", "address": "fa:16:3e:f8:7a:76", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap30b71dbc-27", "ovs_interfaceid": "30b71dbc-27fb-4b14-90f6-296da61fd380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.572 186180 DEBUG nova.network.os_vif_util [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "30b71dbc-27fb-4b14-90f6-296da61fd380", "address": "fa:16:3e:f8:7a:76", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap30b71dbc-27", "ovs_interfaceid": "30b71dbc-27fb-4b14-90f6-296da61fd380", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.572 186180 DEBUG nova.network.os_vif_util [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f8:7a:76,bridge_name='br-int',has_traffic_filtering=True,id=30b71dbc-27fb-4b14-90f6-296da61fd380,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b71dbc-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.573 186180 DEBUG os_vif [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:7a:76,bridge_name='br-int',has_traffic_filtering=True,id=30b71dbc-27fb-4b14-90f6-296da61fd380,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b71dbc-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.574 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.574 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.575 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.577 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.577 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30b71dbc-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.578 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap30b71dbc-27, col_values=(('external_ids', {'iface-id': '30b71dbc-27fb-4b14-90f6-296da61fd380', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f8:7a:76', 'vm-uuid': '2653d20c-6ae2-4f6d-8d76-50640d70defd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.579 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:37 compute-0 NetworkManager[56463]: <info>  [1771262917.5819] manager: (tap30b71dbc-27): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.582 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.586 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.587 186180 INFO os_vif [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f8:7a:76,bridge_name='br-int',has_traffic_filtering=True,id=30b71dbc-27fb-4b14-90f6-296da61fd380,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b71dbc-27')
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.588 186180 DEBUG nova.virt.libvirt.driver [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 17:28:37 compute-0 nova_compute[186176]: 2026-02-16 17:28:37.588 186180 DEBUG nova.compute.manager [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0cd9754f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2653d20c-6ae2-4f6d-8d76-50640d70defd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 17:28:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:38.157 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:38.158 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:38.159 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.313 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.345 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.346 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.346 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.347 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.421 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.494 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.496 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.547 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:28:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:38.665 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.665 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:38.666 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.744 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.746 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5697MB free_disk=73.19855499267578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.746 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.747 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.789 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Migration for instance 2653d20c-6ae2-4f6d-8d76-50640d70defd refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.808 186180 INFO nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Updating resource usage from migration 091f0c8c-7769-4bd3-857c-cfc10fe4f11d
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.809 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Starting to track incoming migration 091f0c8c-7769-4bd3-857c-cfc10fe4f11d with flavor 75ce9d90-876f-4652-a61c-f74d306b6692 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.842 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance 96178328-ed2e-49fe-b48b-c9cc5e9d509c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.858 186180 WARNING nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance 2653d20c-6ae2-4f6d-8d76-50640d70defd has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.859 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.859 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.913 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.930 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.951 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:28:38 compute-0 nova_compute[186176]: 2026-02-16 17:28:38.952 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:39 compute-0 podman[208700]: 2026-02-16 17:28:39.090515746 +0000 UTC m=+0.059911158 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:28:39 compute-0 podman[208699]: 2026-02-16 17:28:39.118016964 +0000 UTC m=+0.085250893 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true)
Feb 16 17:28:39 compute-0 nova_compute[186176]: 2026-02-16 17:28:39.267 186180 DEBUG nova.network.neutron [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Port 30b71dbc-27fb-4b14-90f6-296da61fd380 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 17:28:39 compute-0 nova_compute[186176]: 2026-02-16 17:28:39.269 186180 DEBUG nova.compute.manager [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp0cd9754f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='2653d20c-6ae2-4f6d-8d76-50640d70defd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 17:28:39 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 17:28:39 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 17:28:39 compute-0 kernel: tap30b71dbc-27: entered promiscuous mode
Feb 16 17:28:39 compute-0 NetworkManager[56463]: <info>  [1771262919.5506] manager: (tap30b71dbc-27): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Feb 16 17:28:39 compute-0 ovn_controller[96437]: 2026-02-16T17:28:39Z|00073|binding|INFO|Claiming lport 30b71dbc-27fb-4b14-90f6-296da61fd380 for this additional chassis.
Feb 16 17:28:39 compute-0 nova_compute[186176]: 2026-02-16 17:28:39.553 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:39 compute-0 ovn_controller[96437]: 2026-02-16T17:28:39Z|00074|binding|INFO|30b71dbc-27fb-4b14-90f6-296da61fd380: Claiming fa:16:3e:f8:7a:76 10.100.0.6
Feb 16 17:28:39 compute-0 nova_compute[186176]: 2026-02-16 17:28:39.562 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:39 compute-0 nova_compute[186176]: 2026-02-16 17:28:39.564 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:39 compute-0 ovn_controller[96437]: 2026-02-16T17:28:39Z|00075|binding|INFO|Setting lport 30b71dbc-27fb-4b14-90f6-296da61fd380 ovn-installed in OVS
Feb 16 17:28:39 compute-0 nova_compute[186176]: 2026-02-16 17:28:39.565 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:39 compute-0 nova_compute[186176]: 2026-02-16 17:28:39.567 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:39 compute-0 systemd-machined[155631]: New machine qemu-7-instance-00000008.
Feb 16 17:28:39 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000008.
Feb 16 17:28:39 compute-0 systemd-udevd[208784]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:28:39 compute-0 NetworkManager[56463]: <info>  [1771262919.6327] device (tap30b71dbc-27): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:28:39 compute-0 NetworkManager[56463]: <info>  [1771262919.6334] device (tap30b71dbc-27): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:28:39 compute-0 nova_compute[186176]: 2026-02-16 17:28:39.953 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:28:40 compute-0 nova_compute[186176]: 2026-02-16 17:28:40.646 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262920.6464705, 2653d20c-6ae2-4f6d-8d76-50640d70defd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:28:40 compute-0 nova_compute[186176]: 2026-02-16 17:28:40.648 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] VM Started (Lifecycle Event)
Feb 16 17:28:40 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:40.668 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:40 compute-0 nova_compute[186176]: 2026-02-16 17:28:40.692 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:28:41 compute-0 nova_compute[186176]: 2026-02-16 17:28:41.274 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771262921.2740908, 2653d20c-6ae2-4f6d-8d76-50640d70defd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:28:41 compute-0 nova_compute[186176]: 2026-02-16 17:28:41.275 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] VM Resumed (Lifecycle Event)
Feb 16 17:28:41 compute-0 nova_compute[186176]: 2026-02-16 17:28:41.304 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:28:41 compute-0 nova_compute[186176]: 2026-02-16 17:28:41.307 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:28:41 compute-0 nova_compute[186176]: 2026-02-16 17:28:41.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:28:41 compute-0 nova_compute[186176]: 2026-02-16 17:28:41.349 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 17:28:41 compute-0 nova_compute[186176]: 2026-02-16 17:28:41.460 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:42 compute-0 nova_compute[186176]: 2026-02-16 17:28:42.580 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:42 compute-0 ovn_controller[96437]: 2026-02-16T17:28:42Z|00076|binding|INFO|Claiming lport 30b71dbc-27fb-4b14-90f6-296da61fd380 for this chassis.
Feb 16 17:28:42 compute-0 ovn_controller[96437]: 2026-02-16T17:28:42Z|00077|binding|INFO|30b71dbc-27fb-4b14-90f6-296da61fd380: Claiming fa:16:3e:f8:7a:76 10.100.0.6
Feb 16 17:28:42 compute-0 ovn_controller[96437]: 2026-02-16T17:28:42Z|00078|binding|INFO|Setting lport 30b71dbc-27fb-4b14-90f6-296da61fd380 up in Southbound
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.657 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:7a:76 10.100.0.6'], port_security=['fa:16:3e:f8:7a:76 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2653d20c-6ae2-4f6d-8d76-50640d70defd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70ed1dbc47324f8890fd0ec8599a8f86', 'neutron:revision_number': '11', 'neutron:security_group_ids': '0a5399e9-f3d7-4e52-9a46-0a6482b7de76', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3aa309be-92d4-466f-ba5d-4012e3d564ca, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=30b71dbc-27fb-4b14-90f6-296da61fd380) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.660 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 30b71dbc-27fb-4b14-90f6-296da61fd380 in datapath c1062e90-d609-4ef6-8ee4-d67d0aa8101c bound to our chassis
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.662 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1062e90-d609-4ef6-8ee4-d67d0aa8101c
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.678 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b9a75808-ca60-4edb-abfb-6effe303fe17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.706 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[9a0bcec3-d775-4c53-9608-98e81badc43d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.711 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[aa65e551-0d21-49cd-b0fd-3afd4d16a3e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.734 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[76816892-b838-40da-9c1c-308868f755e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.748 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[07f62cff-2451-4a6d-b23b-0e4b8cb17314]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1062e90-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:31:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444544, 'reachable_time': 43617, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208819, 'error': None, 'target': 'ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.759 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cfac1a-fdd8-47be-9568-79453e5c1e61]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc1062e90-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444555, 'tstamp': 444555}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208820, 'error': None, 'target': 'ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc1062e90-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444557, 'tstamp': 444557}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208820, 'error': None, 'target': 'ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.761 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1062e90-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:42 compute-0 nova_compute[186176]: 2026-02-16 17:28:42.763 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.764 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1062e90-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.765 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.765 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1062e90-d0, col_values=(('external_ids', {'iface-id': '7a5b24ec-e035-4a6a-afde-d2c30fbdd72e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:42.766 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:28:42 compute-0 nova_compute[186176]: 2026-02-16 17:28:42.794 186180 INFO nova.compute.manager [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Post operation of migration started
Feb 16 17:28:43 compute-0 nova_compute[186176]: 2026-02-16 17:28:43.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:28:43 compute-0 nova_compute[186176]: 2026-02-16 17:28:43.419 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-2653d20c-6ae2-4f6d-8d76-50640d70defd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:28:43 compute-0 nova_compute[186176]: 2026-02-16 17:28:43.420 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-2653d20c-6ae2-4f6d-8d76-50640d70defd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:28:43 compute-0 nova_compute[186176]: 2026-02-16 17:28:43.420 186180 DEBUG nova.network.neutron [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:28:44 compute-0 nova_compute[186176]: 2026-02-16 17:28:44.870 186180 DEBUG nova.network.neutron [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Updating instance_info_cache with network_info: [{"id": "30b71dbc-27fb-4b14-90f6-296da61fd380", "address": "fa:16:3e:f8:7a:76", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b71dbc-27", "ovs_interfaceid": "30b71dbc-27fb-4b14-90f6-296da61fd380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:28:44 compute-0 nova_compute[186176]: 2026-02-16 17:28:44.885 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-2653d20c-6ae2-4f6d-8d76-50640d70defd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:28:44 compute-0 nova_compute[186176]: 2026-02-16 17:28:44.904 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:44 compute-0 nova_compute[186176]: 2026-02-16 17:28:44.904 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:44 compute-0 nova_compute[186176]: 2026-02-16 17:28:44.905 186180 DEBUG oslo_concurrency.lockutils [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:44 compute-0 nova_compute[186176]: 2026-02-16 17:28:44.910 186180 INFO nova.virt.libvirt.driver [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 17:28:44 compute-0 virtqemud[185389]: Domain id=7 name='instance-00000008' uuid=2653d20c-6ae2-4f6d-8d76-50640d70defd is tainted: custom-monitor
Feb 16 17:28:45 compute-0 nova_compute[186176]: 2026-02-16 17:28:45.918 186180 INFO nova.virt.libvirt.driver [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 17:28:46 compute-0 nova_compute[186176]: 2026-02-16 17:28:46.463 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:46 compute-0 nova_compute[186176]: 2026-02-16 17:28:46.925 186180 INFO nova.virt.libvirt.driver [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 17:28:46 compute-0 nova_compute[186176]: 2026-02-16 17:28:46.929 186180 DEBUG nova.compute.manager [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:28:46 compute-0 nova_compute[186176]: 2026-02-16 17:28:46.947 186180 DEBUG nova.objects.instance [None req-cf42101e-5c9e-47fc-ac91-8e0a70ed8598 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 17:28:47 compute-0 nova_compute[186176]: 2026-02-16 17:28:47.583 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:51 compute-0 nova_compute[186176]: 2026-02-16 17:28:51.464 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.397 186180 DEBUG oslo_concurrency.lockutils [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "2653d20c-6ae2-4f6d-8d76-50640d70defd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.399 186180 DEBUG oslo_concurrency.lockutils [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "2653d20c-6ae2-4f6d-8d76-50640d70defd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.399 186180 DEBUG oslo_concurrency.lockutils [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "2653d20c-6ae2-4f6d-8d76-50640d70defd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.400 186180 DEBUG oslo_concurrency.lockutils [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "2653d20c-6ae2-4f6d-8d76-50640d70defd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.400 186180 DEBUG oslo_concurrency.lockutils [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "2653d20c-6ae2-4f6d-8d76-50640d70defd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.402 186180 INFO nova.compute.manager [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Terminating instance
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.404 186180 DEBUG nova.compute.manager [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:28:52 compute-0 kernel: tap30b71dbc-27 (unregistering): left promiscuous mode
Feb 16 17:28:52 compute-0 NetworkManager[56463]: <info>  [1771262932.4327] device (tap30b71dbc-27): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.442 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:52 compute-0 ovn_controller[96437]: 2026-02-16T17:28:52Z|00079|binding|INFO|Releasing lport 30b71dbc-27fb-4b14-90f6-296da61fd380 from this chassis (sb_readonly=0)
Feb 16 17:28:52 compute-0 ovn_controller[96437]: 2026-02-16T17:28:52Z|00080|binding|INFO|Setting lport 30b71dbc-27fb-4b14-90f6-296da61fd380 down in Southbound
Feb 16 17:28:52 compute-0 ovn_controller[96437]: 2026-02-16T17:28:52Z|00081|binding|INFO|Removing iface tap30b71dbc-27 ovn-installed in OVS
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.446 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.451 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.453 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f8:7a:76 10.100.0.6'], port_security=['fa:16:3e:f8:7a:76 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2653d20c-6ae2-4f6d-8d76-50640d70defd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70ed1dbc47324f8890fd0ec8599a8f86', 'neutron:revision_number': '13', 'neutron:security_group_ids': '0a5399e9-f3d7-4e52-9a46-0a6482b7de76', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3aa309be-92d4-466f-ba5d-4012e3d564ca, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=30b71dbc-27fb-4b14-90f6-296da61fd380) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.457 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 30b71dbc-27fb-4b14-90f6-296da61fd380 in datapath c1062e90-d609-4ef6-8ee4-d67d0aa8101c unbound from our chassis
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.459 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1062e90-d609-4ef6-8ee4-d67d0aa8101c
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.477 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f5653653-c95d-44e6-95ad-e15008b0f237]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:52 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Deactivated successfully.
Feb 16 17:28:52 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000008.scope: Consumed 1.939s CPU time.
Feb 16 17:28:52 compute-0 systemd-machined[155631]: Machine qemu-7-instance-00000008 terminated.
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.506 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[62b9430e-9cdb-459b-b0d5-dc10a12543db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.510 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[7d163698-77a9-443a-adc8-9ca3a73fc9e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.534 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[edfd91af-83b8-4522-a274-6ad82fea8af6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.556 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[a6260daf-b4c0-4ccd-bfdb-a523c3900482]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1062e90-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cf:31:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444544, 'reachable_time': 43617, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208847, 'error': None, 'target': 'ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.580 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[fd8cb185-8f7c-424d-8aef-2664ba237ce8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc1062e90-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444555, 'tstamp': 444555}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208848, 'error': None, 'target': 'ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc1062e90-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444557, 'tstamp': 444557}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208848, 'error': None, 'target': 'ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.582 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1062e90-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.583 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.584 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.588 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.588 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1062e90-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.589 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.589 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1062e90-d0, col_values=(('external_ids', {'iface-id': '7a5b24ec-e035-4a6a-afde-d2c30fbdd72e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:52.589 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.657 186180 INFO nova.virt.libvirt.driver [-] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Instance destroyed successfully.
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.658 186180 DEBUG nova.objects.instance [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lazy-loading 'resources' on Instance uuid 2653d20c-6ae2-4f6d-8d76-50640d70defd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.676 186180 DEBUG nova.virt.libvirt.vif [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T17:27:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-808493043',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-808493043',id=8,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:27:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70ed1dbc47324f8890fd0ec8599a8f86',ramdisk_id='',reservation_id='r-ykc0caat',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-TestExecuteBasicStrategy-1263148546',owner_user_name='tempest-TestExecuteBasicStrategy-1263148546-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:28:46Z,user_data=None,user_id='e780e424f9f94938bff44671975be4ba',uuid=2653d20c-6ae2-4f6d-8d76-50640d70defd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "30b71dbc-27fb-4b14-90f6-296da61fd380", "address": "fa:16:3e:f8:7a:76", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b71dbc-27", "ovs_interfaceid": "30b71dbc-27fb-4b14-90f6-296da61fd380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.676 186180 DEBUG nova.network.os_vif_util [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Converting VIF {"id": "30b71dbc-27fb-4b14-90f6-296da61fd380", "address": "fa:16:3e:f8:7a:76", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap30b71dbc-27", "ovs_interfaceid": "30b71dbc-27fb-4b14-90f6-296da61fd380", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.677 186180 DEBUG nova.network.os_vif_util [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f8:7a:76,bridge_name='br-int',has_traffic_filtering=True,id=30b71dbc-27fb-4b14-90f6-296da61fd380,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b71dbc-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.678 186180 DEBUG os_vif [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:7a:76,bridge_name='br-int',has_traffic_filtering=True,id=30b71dbc-27fb-4b14-90f6-296da61fd380,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b71dbc-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.680 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.680 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30b71dbc-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.683 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.686 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.689 186180 INFO os_vif [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f8:7a:76,bridge_name='br-int',has_traffic_filtering=True,id=30b71dbc-27fb-4b14-90f6-296da61fd380,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap30b71dbc-27')
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.690 186180 INFO nova.virt.libvirt.driver [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Deleting instance files /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd_del
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.691 186180 INFO nova.virt.libvirt.driver [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Deletion of /var/lib/nova/instances/2653d20c-6ae2-4f6d-8d76-50640d70defd_del complete
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.745 186180 INFO nova.compute.manager [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Took 0.34 seconds to destroy the instance on the hypervisor.
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.745 186180 DEBUG oslo.service.loopingcall [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.746 186180 DEBUG nova.compute.manager [-] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:28:52 compute-0 nova_compute[186176]: 2026-02-16 17:28:52.746 186180 DEBUG nova.network.neutron [-] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.443 186180 DEBUG nova.compute.manager [req-cbf6b2cd-f311-471f-af29-d33ebf6704c6 req-ef8f347f-3b64-4842-bc17-6a1789b3aca0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Received event network-vif-unplugged-30b71dbc-27fb-4b14-90f6-296da61fd380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.444 186180 DEBUG oslo_concurrency.lockutils [req-cbf6b2cd-f311-471f-af29-d33ebf6704c6 req-ef8f347f-3b64-4842-bc17-6a1789b3aca0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "2653d20c-6ae2-4f6d-8d76-50640d70defd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.444 186180 DEBUG oslo_concurrency.lockutils [req-cbf6b2cd-f311-471f-af29-d33ebf6704c6 req-ef8f347f-3b64-4842-bc17-6a1789b3aca0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "2653d20c-6ae2-4f6d-8d76-50640d70defd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.444 186180 DEBUG oslo_concurrency.lockutils [req-cbf6b2cd-f311-471f-af29-d33ebf6704c6 req-ef8f347f-3b64-4842-bc17-6a1789b3aca0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "2653d20c-6ae2-4f6d-8d76-50640d70defd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.445 186180 DEBUG nova.compute.manager [req-cbf6b2cd-f311-471f-af29-d33ebf6704c6 req-ef8f347f-3b64-4842-bc17-6a1789b3aca0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] No waiting events found dispatching network-vif-unplugged-30b71dbc-27fb-4b14-90f6-296da61fd380 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.445 186180 DEBUG nova.compute.manager [req-cbf6b2cd-f311-471f-af29-d33ebf6704c6 req-ef8f347f-3b64-4842-bc17-6a1789b3aca0 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Received event network-vif-unplugged-30b71dbc-27fb-4b14-90f6-296da61fd380 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.643 186180 DEBUG nova.network.neutron [-] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.665 186180 INFO nova.compute.manager [-] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Took 1.92 seconds to deallocate network for instance.
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.708 186180 DEBUG oslo_concurrency.lockutils [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.708 186180 DEBUG oslo_concurrency.lockutils [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.716 186180 DEBUG oslo_concurrency.lockutils [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.750 186180 INFO nova.scheduler.client.report [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Deleted allocations for instance 2653d20c-6ae2-4f6d-8d76-50640d70defd
Feb 16 17:28:54 compute-0 nova_compute[186176]: 2026-02-16 17:28:54.830 186180 DEBUG oslo_concurrency.lockutils [None req-8796fb07-a2f4-4a0a-8749-597f69581ca4 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "2653d20c-6ae2-4f6d-8d76-50640d70defd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.640 186180 DEBUG oslo_concurrency.lockutils [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.640 186180 DEBUG oslo_concurrency.lockutils [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.641 186180 DEBUG oslo_concurrency.lockutils [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.641 186180 DEBUG oslo_concurrency.lockutils [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.641 186180 DEBUG oslo_concurrency.lockutils [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.643 186180 INFO nova.compute.manager [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Terminating instance
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.644 186180 DEBUG nova.compute.manager [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:28:55 compute-0 kernel: tap5879f7be-34 (unregistering): left promiscuous mode
Feb 16 17:28:55 compute-0 NetworkManager[56463]: <info>  [1771262935.6782] device (tap5879f7be-34): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.678 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:55 compute-0 ovn_controller[96437]: 2026-02-16T17:28:55Z|00082|binding|INFO|Releasing lport 5879f7be-34ad-4c93-92ab-de77c246915c from this chassis (sb_readonly=0)
Feb 16 17:28:55 compute-0 ovn_controller[96437]: 2026-02-16T17:28:55Z|00083|binding|INFO|Setting lport 5879f7be-34ad-4c93-92ab-de77c246915c down in Southbound
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.685 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:55 compute-0 ovn_controller[96437]: 2026-02-16T17:28:55Z|00084|binding|INFO|Removing iface tap5879f7be-34 ovn-installed in OVS
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.688 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:55.694 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:e9:4d 10.100.0.4'], port_security=['fa:16:3e:1e:e9:4d 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '96178328-ed2e-49fe-b48b-c9cc5e9d509c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '70ed1dbc47324f8890fd0ec8599a8f86', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0a5399e9-f3d7-4e52-9a46-0a6482b7de76', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3aa309be-92d4-466f-ba5d-4012e3d564ca, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=5879f7be-34ad-4c93-92ab-de77c246915c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.695 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:55.699 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 5879f7be-34ad-4c93-92ab-de77c246915c in datapath c1062e90-d609-4ef6-8ee4-d67d0aa8101c unbound from our chassis
Feb 16 17:28:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:55.703 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c1062e90-d609-4ef6-8ee4-d67d0aa8101c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:28:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:55.704 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[737648c0-cbd5-4990-a100-3f6636ee50f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:55.705 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c namespace which is not needed anymore
Feb 16 17:28:55 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000007.scope: Deactivated successfully.
Feb 16 17:28:55 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000007.scope: Consumed 15.139s CPU time.
Feb 16 17:28:55 compute-0 systemd-machined[155631]: Machine qemu-6-instance-00000007 terminated.
Feb 16 17:28:55 compute-0 neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c[208456]: [NOTICE]   (208460) : haproxy version is 2.8.14-c23fe91
Feb 16 17:28:55 compute-0 neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c[208456]: [NOTICE]   (208460) : path to executable is /usr/sbin/haproxy
Feb 16 17:28:55 compute-0 neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c[208456]: [WARNING]  (208460) : Exiting Master process...
Feb 16 17:28:55 compute-0 neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c[208456]: [ALERT]    (208460) : Current worker (208462) exited with code 143 (Terminated)
Feb 16 17:28:55 compute-0 neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c[208456]: [WARNING]  (208460) : All workers exited. Exiting... (0)
Feb 16 17:28:55 compute-0 systemd[1]: libpod-3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a.scope: Deactivated successfully.
Feb 16 17:28:55 compute-0 podman[208891]: 2026-02-16 17:28:55.849034398 +0000 UTC m=+0.048008274 container died 3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.864 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.868 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a-userdata-shm.mount: Deactivated successfully.
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.887 186180 INFO nova.virt.libvirt.driver [-] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Instance destroyed successfully.
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.888 186180 DEBUG nova.objects.instance [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lazy-loading 'resources' on Instance uuid 96178328-ed2e-49fe-b48b-c9cc5e9d509c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:28:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-d7a367084e238bd8d9c9b4c1c052b6a55c72be62b22d3cfb7b642e770e81ed6b-merged.mount: Deactivated successfully.
Feb 16 17:28:55 compute-0 podman[208891]: 2026-02-16 17:28:55.895328619 +0000 UTC m=+0.094302496 container cleanup 3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 17:28:55 compute-0 systemd[1]: libpod-conmon-3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a.scope: Deactivated successfully.
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.905 186180 DEBUG nova.virt.libvirt.vif [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:27:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-256910617',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-256910617',id=7,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:27:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='70ed1dbc47324f8890fd0ec8599a8f86',ramdisk_id='',reservation_id='r-wks0qpvt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_
name='tempest-TestExecuteBasicStrategy-1263148546',owner_user_name='tempest-TestExecuteBasicStrategy-1263148546-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:27:30Z,user_data=None,user_id='e780e424f9f94938bff44671975be4ba',uuid=96178328-ed2e-49fe-b48b-c9cc5e9d509c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.905 186180 DEBUG nova.network.os_vif_util [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Converting VIF {"id": "5879f7be-34ad-4c93-92ab-de77c246915c", "address": "fa:16:3e:1e:e9:4d", "network": {"id": "c1062e90-d609-4ef6-8ee4-d67d0aa8101c", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-1800595780-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "70ed1dbc47324f8890fd0ec8599a8f86", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5879f7be-34", "ovs_interfaceid": "5879f7be-34ad-4c93-92ab-de77c246915c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.906 186180 DEBUG nova.network.os_vif_util [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:e9:4d,bridge_name='br-int',has_traffic_filtering=True,id=5879f7be-34ad-4c93-92ab-de77c246915c,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5879f7be-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.906 186180 DEBUG os_vif [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:e9:4d,bridge_name='br-int',has_traffic_filtering=True,id=5879f7be-34ad-4c93-92ab-de77c246915c,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5879f7be-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.908 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.908 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5879f7be-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.910 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.912 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.915 186180 INFO os_vif [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:e9:4d,bridge_name='br-int',has_traffic_filtering=True,id=5879f7be-34ad-4c93-92ab-de77c246915c,network=Network(c1062e90-d609-4ef6-8ee4-d67d0aa8101c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5879f7be-34')
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.916 186180 INFO nova.virt.libvirt.driver [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Deleting instance files /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c_del
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.916 186180 INFO nova.virt.libvirt.driver [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Deletion of /var/lib/nova/instances/96178328-ed2e-49fe-b48b-c9cc5e9d509c_del complete
Feb 16 17:28:55 compute-0 podman[208935]: 2026-02-16 17:28:55.960572688 +0000 UTC m=+0.040296484 container remove 3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.963 186180 INFO nova.compute.manager [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Took 0.32 seconds to destroy the instance on the hypervisor.
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.963 186180 DEBUG oslo.service.loopingcall [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.964 186180 DEBUG nova.compute.manager [-] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.964 186180 DEBUG nova.network.neutron [-] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:28:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:55.966 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[65f61f31-388f-4718-910a-24e002c8af1e]: (4, ('Mon Feb 16 05:28:55 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c (3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a)\n3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a\nMon Feb 16 05:28:55 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c (3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a)\n3c978410cd085adf204a60bf010e836c880b28fcff2fc5d41d6616487fa1f11a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:55.968 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[8c8840f9-bc7b-4bbe-953b-c1b5fffbca87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:55.969 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1062e90-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.971 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:55 compute-0 kernel: tapc1062e90-d0: left promiscuous mode
Feb 16 17:28:55 compute-0 nova_compute[186176]: 2026-02-16 17:28:55.980 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:55.983 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[9caada8d-a660-4023-a1a8-933338a84573]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:55.999 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[7bf9b245-c052-4660-a1a8-93f41a3eb990]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:56.000 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4e17eb-e2b2-4332-a7b5-e384a648d959]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:56.014 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa9dd88-ce6c-4878-acb7-9ab33b7a4661]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444537, 'reachable_time': 33179, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208950, 'error': None, 'target': 'ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:56.018 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c1062e90-d609-4ef6-8ee4-d67d0aa8101c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:28:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:28:56.018 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[a351abc2-33fb-4c55-81e0-b95f98ac3e7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:28:56 compute-0 systemd[1]: run-netns-ovnmeta\x2dc1062e90\x2dd609\x2d4ef6\x2d8ee4\x2dd67d0aa8101c.mount: Deactivated successfully.
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.512 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.535 186180 DEBUG nova.compute.manager [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Received event network-vif-plugged-30b71dbc-27fb-4b14-90f6-296da61fd380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.536 186180 DEBUG oslo_concurrency.lockutils [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "2653d20c-6ae2-4f6d-8d76-50640d70defd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.537 186180 DEBUG oslo_concurrency.lockutils [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "2653d20c-6ae2-4f6d-8d76-50640d70defd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.537 186180 DEBUG oslo_concurrency.lockutils [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "2653d20c-6ae2-4f6d-8d76-50640d70defd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.537 186180 DEBUG nova.compute.manager [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] No waiting events found dispatching network-vif-plugged-30b71dbc-27fb-4b14-90f6-296da61fd380 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.537 186180 WARNING nova.compute.manager [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Received unexpected event network-vif-plugged-30b71dbc-27fb-4b14-90f6-296da61fd380 for instance with vm_state deleted and task_state None.
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.538 186180 DEBUG nova.compute.manager [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Received event network-vif-deleted-30b71dbc-27fb-4b14-90f6-296da61fd380 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.538 186180 DEBUG nova.compute.manager [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Received event network-vif-unplugged-5879f7be-34ad-4c93-92ab-de77c246915c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.538 186180 DEBUG oslo_concurrency.lockutils [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.539 186180 DEBUG oslo_concurrency.lockutils [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.539 186180 DEBUG oslo_concurrency.lockutils [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.539 186180 DEBUG nova.compute.manager [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] No waiting events found dispatching network-vif-unplugged-5879f7be-34ad-4c93-92ab-de77c246915c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:28:56 compute-0 nova_compute[186176]: 2026-02-16 17:28:56.540 186180 DEBUG nova.compute.manager [req-c76c2d68-14a8-44fb-bc93-2367a7cf236c req-0f261bcf-7137-4e21-a8bb-82a29a169066 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Received event network-vif-unplugged-5879f7be-34ad-4c93-92ab-de77c246915c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:28:57 compute-0 nova_compute[186176]: 2026-02-16 17:28:57.286 186180 DEBUG nova.network.neutron [-] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:28:57 compute-0 nova_compute[186176]: 2026-02-16 17:28:57.301 186180 INFO nova.compute.manager [-] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Took 1.34 seconds to deallocate network for instance.
Feb 16 17:28:57 compute-0 nova_compute[186176]: 2026-02-16 17:28:57.341 186180 DEBUG oslo_concurrency.lockutils [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:57 compute-0 nova_compute[186176]: 2026-02-16 17:28:57.341 186180 DEBUG oslo_concurrency.lockutils [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:57 compute-0 nova_compute[186176]: 2026-02-16 17:28:57.389 186180 DEBUG nova.compute.provider_tree [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:28:57 compute-0 nova_compute[186176]: 2026-02-16 17:28:57.419 186180 DEBUG nova.scheduler.client.report [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:28:57 compute-0 nova_compute[186176]: 2026-02-16 17:28:57.447 186180 DEBUG oslo_concurrency.lockutils [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:57 compute-0 nova_compute[186176]: 2026-02-16 17:28:57.505 186180 INFO nova.scheduler.client.report [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Deleted allocations for instance 96178328-ed2e-49fe-b48b-c9cc5e9d509c
Feb 16 17:28:57 compute-0 nova_compute[186176]: 2026-02-16 17:28:57.687 186180 DEBUG oslo_concurrency.lockutils [None req-b10ad8ba-cd9a-4620-885c-92eb4954a781 e780e424f9f94938bff44671975be4ba 70ed1dbc47324f8890fd0ec8599a8f86 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.047s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:58 compute-0 nova_compute[186176]: 2026-02-16 17:28:58.615 186180 DEBUG nova.compute.manager [req-5f64a7ce-8647-47cc-aeda-5af51a8a4100 req-9e6da103-0711-416b-8028-336c506114ec 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Received event network-vif-plugged-5879f7be-34ad-4c93-92ab-de77c246915c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:28:58 compute-0 nova_compute[186176]: 2026-02-16 17:28:58.616 186180 DEBUG oslo_concurrency.lockutils [req-5f64a7ce-8647-47cc-aeda-5af51a8a4100 req-9e6da103-0711-416b-8028-336c506114ec 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:28:58 compute-0 nova_compute[186176]: 2026-02-16 17:28:58.616 186180 DEBUG oslo_concurrency.lockutils [req-5f64a7ce-8647-47cc-aeda-5af51a8a4100 req-9e6da103-0711-416b-8028-336c506114ec 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:28:58 compute-0 nova_compute[186176]: 2026-02-16 17:28:58.616 186180 DEBUG oslo_concurrency.lockutils [req-5f64a7ce-8647-47cc-aeda-5af51a8a4100 req-9e6da103-0711-416b-8028-336c506114ec 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96178328-ed2e-49fe-b48b-c9cc5e9d509c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:28:58 compute-0 nova_compute[186176]: 2026-02-16 17:28:58.617 186180 DEBUG nova.compute.manager [req-5f64a7ce-8647-47cc-aeda-5af51a8a4100 req-9e6da103-0711-416b-8028-336c506114ec 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] No waiting events found dispatching network-vif-plugged-5879f7be-34ad-4c93-92ab-de77c246915c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:28:58 compute-0 nova_compute[186176]: 2026-02-16 17:28:58.617 186180 WARNING nova.compute.manager [req-5f64a7ce-8647-47cc-aeda-5af51a8a4100 req-9e6da103-0711-416b-8028-336c506114ec 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Received unexpected event network-vif-plugged-5879f7be-34ad-4c93-92ab-de77c246915c for instance with vm_state deleted and task_state None.
Feb 16 17:28:58 compute-0 nova_compute[186176]: 2026-02-16 17:28:58.617 186180 DEBUG nova.compute.manager [req-5f64a7ce-8647-47cc-aeda-5af51a8a4100 req-9e6da103-0711-416b-8028-336c506114ec 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Received event network-vif-deleted-5879f7be-34ad-4c93-92ab-de77c246915c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:28:59 compute-0 podman[195505]: time="2026-02-16T17:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:28:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:28:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2170 "" "Go-http-client/1.1"
Feb 16 17:29:00 compute-0 podman[208951]: 2026-02-16 17:29:00.11188831 +0000 UTC m=+0.073940044 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Feb 16 17:29:00 compute-0 nova_compute[186176]: 2026-02-16 17:29:00.912 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:01 compute-0 openstack_network_exporter[198360]: ERROR   17:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:29:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:29:01 compute-0 openstack_network_exporter[198360]: ERROR   17:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:29:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:29:01 compute-0 nova_compute[186176]: 2026-02-16 17:29:01.514 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:02 compute-0 podman[208974]: 2026-02-16 17:29:02.120406632 +0000 UTC m=+0.081981322 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 17:29:05 compute-0 nova_compute[186176]: 2026-02-16 17:29:05.915 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:06 compute-0 nova_compute[186176]: 2026-02-16 17:29:06.516 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:07 compute-0 nova_compute[186176]: 2026-02-16 17:29:07.656 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771262932.654639, 2653d20c-6ae2-4f6d-8d76-50640d70defd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:29:07 compute-0 nova_compute[186176]: 2026-02-16 17:29:07.657 186180 INFO nova.compute.manager [-] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] VM Stopped (Lifecycle Event)
Feb 16 17:29:07 compute-0 nova_compute[186176]: 2026-02-16 17:29:07.677 186180 DEBUG nova.compute.manager [None req-68c52334-cd3f-43fe-bd04-026187a9bdd2 - - - - - -] [instance: 2653d20c-6ae2-4f6d-8d76-50640d70defd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:29:10 compute-0 podman[208995]: 2026-02-16 17:29:10.118195131 +0000 UTC m=+0.075858041 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:29:10 compute-0 podman[208994]: 2026-02-16 17:29:10.158228958 +0000 UTC m=+0.121716602 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 17:29:10 compute-0 nova_compute[186176]: 2026-02-16 17:29:10.887 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771262935.8863711, 96178328-ed2e-49fe-b48b-c9cc5e9d509c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:29:10 compute-0 nova_compute[186176]: 2026-02-16 17:29:10.887 186180 INFO nova.compute.manager [-] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] VM Stopped (Lifecycle Event)
Feb 16 17:29:10 compute-0 nova_compute[186176]: 2026-02-16 17:29:10.913 186180 DEBUG nova.compute.manager [None req-ad155d84-e99e-465f-99f9-4f2ea4b8df4b - - - - - -] [instance: 96178328-ed2e-49fe-b48b-c9cc5e9d509c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:29:10 compute-0 nova_compute[186176]: 2026-02-16 17:29:10.918 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:11 compute-0 nova_compute[186176]: 2026-02-16 17:29:11.519 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:15 compute-0 nova_compute[186176]: 2026-02-16 17:29:15.920 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:16 compute-0 nova_compute[186176]: 2026-02-16 17:29:16.522 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:20 compute-0 nova_compute[186176]: 2026-02-16 17:29:20.964 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:21 compute-0 nova_compute[186176]: 2026-02-16 17:29:21.433 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:21 compute-0 nova_compute[186176]: 2026-02-16 17:29:21.524 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:26 compute-0 nova_compute[186176]: 2026-02-16 17:29:26.006 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:26 compute-0 nova_compute[186176]: 2026-02-16 17:29:26.525 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:29 compute-0 podman[195505]: time="2026-02-16T17:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:29:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:29:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 17:29:31 compute-0 nova_compute[186176]: 2026-02-16 17:29:31.010 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:31 compute-0 podman[209043]: 2026-02-16 17:29:31.093999844 +0000 UTC m=+0.066397754 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 16 17:29:31 compute-0 openstack_network_exporter[198360]: ERROR   17:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:29:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:29:31 compute-0 openstack_network_exporter[198360]: ERROR   17:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:29:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:29:31 compute-0 nova_compute[186176]: 2026-02-16 17:29:31.527 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:33 compute-0 podman[209064]: 2026-02-16 17:29:33.107165684 +0000 UTC m=+0.065069821 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:29:33 compute-0 nova_compute[186176]: 2026-02-16 17:29:33.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:29:33 compute-0 nova_compute[186176]: 2026-02-16 17:29:33.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:29:33 compute-0 nova_compute[186176]: 2026-02-16 17:29:33.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:29:33 compute-0 nova_compute[186176]: 2026-02-16 17:29:33.333 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:29:35 compute-0 nova_compute[186176]: 2026-02-16 17:29:35.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:29:35 compute-0 nova_compute[186176]: 2026-02-16 17:29:35.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:29:35 compute-0 nova_compute[186176]: 2026-02-16 17:29:35.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:29:36 compute-0 nova_compute[186176]: 2026-02-16 17:29:36.014 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:29:36.532 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:29:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:29:36.533 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:29:36 compute-0 nova_compute[186176]: 2026-02-16 17:29:36.552 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:29:38.159 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:29:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:29:38.160 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:29:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:29:38.161 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.341 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.342 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.342 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.342 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.564 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.567 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5861MB free_disk=73.22416305541992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.568 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.568 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.635 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.636 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.652 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing inventories for resource provider bb904aac-529f-46ef-9861-9c655a4b383c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.672 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating ProviderTree inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.672 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.693 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing aggregate associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.712 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing trait associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.736 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.752 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.791 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:29:39 compute-0 nova_compute[186176]: 2026-02-16 17:29:39.792 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:29:40 compute-0 nova_compute[186176]: 2026-02-16 17:29:40.793 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:29:41 compute-0 nova_compute[186176]: 2026-02-16 17:29:41.066 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:41 compute-0 podman[209085]: 2026-02-16 17:29:41.120715543 +0000 UTC m=+0.091128117 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:29:41 compute-0 podman[209098]: 2026-02-16 17:29:41.191267419 +0000 UTC m=+0.103748409 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:29:41 compute-0 nova_compute[186176]: 2026-02-16 17:29:41.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:29:41 compute-0 nova_compute[186176]: 2026-02-16 17:29:41.554 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:43 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:29:43.536 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:29:45 compute-0 nova_compute[186176]: 2026-02-16 17:29:45.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:29:46 compute-0 nova_compute[186176]: 2026-02-16 17:29:46.070 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:46 compute-0 nova_compute[186176]: 2026-02-16 17:29:46.556 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:51 compute-0 nova_compute[186176]: 2026-02-16 17:29:51.074 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:51 compute-0 nova_compute[186176]: 2026-02-16 17:29:51.559 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:54 compute-0 ovn_controller[96437]: 2026-02-16T17:29:54Z|00085|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Feb 16 17:29:56 compute-0 nova_compute[186176]: 2026-02-16 17:29:56.078 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:56 compute-0 nova_compute[186176]: 2026-02-16 17:29:56.569 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:29:59 compute-0 podman[195505]: time="2026-02-16T17:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:29:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:29:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2173 "" "Go-http-client/1.1"
Feb 16 17:30:01 compute-0 nova_compute[186176]: 2026-02-16 17:30:01.079 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:01 compute-0 openstack_network_exporter[198360]: ERROR   17:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:30:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:30:01 compute-0 openstack_network_exporter[198360]: ERROR   17:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:30:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:30:01 compute-0 nova_compute[186176]: 2026-02-16 17:30:01.571 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:02 compute-0 podman[209136]: 2026-02-16 17:30:02.089507793 +0000 UTC m=+0.058491468 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, managed_by=edpm_ansible, release=1770267347, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 16 17:30:04 compute-0 podman[209157]: 2026-02-16 17:30:04.113979231 +0000 UTC m=+0.077333084 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 17:30:06 compute-0 nova_compute[186176]: 2026-02-16 17:30:06.082 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:06 compute-0 nova_compute[186176]: 2026-02-16 17:30:06.572 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:11 compute-0 nova_compute[186176]: 2026-02-16 17:30:11.086 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:11 compute-0 nova_compute[186176]: 2026-02-16 17:30:11.612 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:12 compute-0 podman[209177]: 2026-02-16 17:30:12.10889626 +0000 UTC m=+0.068416184 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:30:12 compute-0 podman[209176]: 2026-02-16 17:30:12.14001292 +0000 UTC m=+0.104849806 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller)
Feb 16 17:30:16 compute-0 nova_compute[186176]: 2026-02-16 17:30:16.089 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:16 compute-0 nova_compute[186176]: 2026-02-16 17:30:16.613 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:21 compute-0 nova_compute[186176]: 2026-02-16 17:30:21.128 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:21 compute-0 nova_compute[186176]: 2026-02-16 17:30:21.615 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:26 compute-0 nova_compute[186176]: 2026-02-16 17:30:26.132 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:26 compute-0 nova_compute[186176]: 2026-02-16 17:30:26.640 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:29 compute-0 podman[195505]: time="2026-02-16T17:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:30:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:30:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 16 17:30:31 compute-0 nova_compute[186176]: 2026-02-16 17:30:31.152 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:31 compute-0 openstack_network_exporter[198360]: ERROR   17:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:30:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:30:31 compute-0 openstack_network_exporter[198360]: ERROR   17:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:30:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:30:31 compute-0 nova_compute[186176]: 2026-02-16 17:30:31.641 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:33 compute-0 podman[209226]: 2026-02-16 17:30:33.074250694 +0000 UTC m=+0.051030204 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=openstack_network_exporter, version=9.7, architecture=x86_64, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 17:30:34 compute-0 nova_compute[186176]: 2026-02-16 17:30:34.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:30:34 compute-0 nova_compute[186176]: 2026-02-16 17:30:34.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:30:34 compute-0 nova_compute[186176]: 2026-02-16 17:30:34.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:30:34 compute-0 nova_compute[186176]: 2026-02-16 17:30:34.330 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:30:35 compute-0 podman[209249]: 2026-02-16 17:30:35.127315922 +0000 UTC m=+0.096421027 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 17:30:36 compute-0 nova_compute[186176]: 2026-02-16 17:30:36.157 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:36 compute-0 nova_compute[186176]: 2026-02-16 17:30:36.642 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:37 compute-0 nova_compute[186176]: 2026-02-16 17:30:37.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:30:37 compute-0 nova_compute[186176]: 2026-02-16 17:30:37.334 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:30:37 compute-0 nova_compute[186176]: 2026-02-16 17:30:37.334 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:30:37 compute-0 nova_compute[186176]: 2026-02-16 17:30:37.335 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:30:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:38.161 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:30:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:38.162 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:30:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:38.162 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.343 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.343 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.344 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.344 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.509 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.511 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5871MB free_disk=73.22395324707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.511 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.512 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.579 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.579 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.667 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.682 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.686 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:30:40 compute-0 nova_compute[186176]: 2026-02-16 17:30:40.686 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:30:41 compute-0 nova_compute[186176]: 2026-02-16 17:30:41.161 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:41 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 17:30:41 compute-0 nova_compute[186176]: 2026-02-16 17:30:41.644 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:41 compute-0 nova_compute[186176]: 2026-02-16 17:30:41.687 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:30:42 compute-0 nova_compute[186176]: 2026-02-16 17:30:42.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:30:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:42.490 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:30:42 compute-0 nova_compute[186176]: 2026-02-16 17:30:42.491 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:42.491 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:30:43 compute-0 podman[209270]: 2026-02-16 17:30:43.107375122 +0000 UTC m=+0.070742791 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:30:43 compute-0 podman[209269]: 2026-02-16 17:30:43.116857097 +0000 UTC m=+0.091065274 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:30:46 compute-0 nova_compute[186176]: 2026-02-16 17:30:46.199 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:46 compute-0 nova_compute[186176]: 2026-02-16 17:30:46.647 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.319 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.377 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.377 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.396 186180 DEBUG nova.compute.manager [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.473 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.474 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.483 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.484 186180 INFO nova.compute.claims [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.593 186180 DEBUG nova.compute.provider_tree [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.613 186180 DEBUG nova.scheduler.client.report [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.636 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.636 186180 DEBUG nova.compute.manager [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.677 186180 DEBUG nova.compute.manager [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.678 186180 DEBUG nova.network.neutron [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.695 186180 INFO nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.716 186180 DEBUG nova.compute.manager [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.811 186180 DEBUG nova.compute.manager [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.813 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.813 186180 INFO nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Creating image(s)
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.814 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "/var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.814 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "/var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.815 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "/var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.833 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.890 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.891 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.891 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.906 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.959 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.961 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.989 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.991 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:30:47 compute-0 nova_compute[186176]: 2026-02-16 17:30:47.991 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.070 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.071 186180 DEBUG nova.virt.disk.api [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Checking if we can resize image /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.071 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.135 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.136 186180 DEBUG nova.virt.disk.api [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Cannot resize image /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.137 186180 DEBUG nova.objects.instance [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lazy-loading 'migration_context' on Instance uuid 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.158 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.158 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Ensure instance console log exists: /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.159 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.159 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.160 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.318 186180 DEBUG nova.policy [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '468cc4ab56ec477b890d2e4cc38a2ddc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6fb6560dd6834661a01ab8901000d6ac', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:30:48 compute-0 nova_compute[186176]: 2026-02-16 17:30:48.794 186180 DEBUG nova.network.neutron [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Successfully created port: 25663d0b-d0ac-42cb-aa23-764bc38359f3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:30:49 compute-0 nova_compute[186176]: 2026-02-16 17:30:49.441 186180 DEBUG nova.network.neutron [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Successfully updated port: 25663d0b-d0ac-42cb-aa23-764bc38359f3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:30:49 compute-0 nova_compute[186176]: 2026-02-16 17:30:49.458 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "refresh_cache-5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:30:49 compute-0 nova_compute[186176]: 2026-02-16 17:30:49.458 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquired lock "refresh_cache-5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:30:49 compute-0 nova_compute[186176]: 2026-02-16 17:30:49.459 186180 DEBUG nova.network.neutron [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:30:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:49.493 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:30:49 compute-0 nova_compute[186176]: 2026-02-16 17:30:49.503 186180 DEBUG nova.compute.manager [req-06696920-6e5a-440c-aaf1-50258d66851f req-db8e9435-91d8-4ba3-82e5-1bfd981700fc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Received event network-changed-25663d0b-d0ac-42cb-aa23-764bc38359f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:30:49 compute-0 nova_compute[186176]: 2026-02-16 17:30:49.504 186180 DEBUG nova.compute.manager [req-06696920-6e5a-440c-aaf1-50258d66851f req-db8e9435-91d8-4ba3-82e5-1bfd981700fc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Refreshing instance network info cache due to event network-changed-25663d0b-d0ac-42cb-aa23-764bc38359f3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:30:49 compute-0 nova_compute[186176]: 2026-02-16 17:30:49.504 186180 DEBUG oslo_concurrency.lockutils [req-06696920-6e5a-440c-aaf1-50258d66851f req-db8e9435-91d8-4ba3-82e5-1bfd981700fc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:30:49 compute-0 nova_compute[186176]: 2026-02-16 17:30:49.578 186180 DEBUG nova.network.neutron [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.525 186180 DEBUG nova.network.neutron [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Updating instance_info_cache with network_info: [{"id": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "address": "fa:16:3e:22:07:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25663d0b-d0", "ovs_interfaceid": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.554 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Releasing lock "refresh_cache-5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.555 186180 DEBUG nova.compute.manager [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Instance network_info: |[{"id": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "address": "fa:16:3e:22:07:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25663d0b-d0", "ovs_interfaceid": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.555 186180 DEBUG oslo_concurrency.lockutils [req-06696920-6e5a-440c-aaf1-50258d66851f req-db8e9435-91d8-4ba3-82e5-1bfd981700fc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.556 186180 DEBUG nova.network.neutron [req-06696920-6e5a-440c-aaf1-50258d66851f req-db8e9435-91d8-4ba3-82e5-1bfd981700fc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Refreshing network info cache for port 25663d0b-d0ac-42cb-aa23-764bc38359f3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.558 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Start _get_guest_xml network_info=[{"id": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "address": "fa:16:3e:22:07:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25663d0b-d0", "ovs_interfaceid": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.562 186180 WARNING nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.570 186180 DEBUG nova.virt.libvirt.host [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.571 186180 DEBUG nova.virt.libvirt.host [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.580 186180 DEBUG nova.virt.libvirt.host [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.581 186180 DEBUG nova.virt.libvirt.host [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.582 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.582 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.582 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.583 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.583 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.583 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.583 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.583 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.584 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.584 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.584 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.584 186180 DEBUG nova.virt.hardware [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.587 186180 DEBUG nova.virt.libvirt.vif [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:30:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-924773153',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-924773153',id=10,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6fb6560dd6834661a01ab8901000d6ac',ramdisk_id='',reservation_id='r-npsi4q2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-810545833',owner_user_name='tempest-TestExecuteHostMaintenanceStrat
egy-810545833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:30:47Z,user_data=None,user_id='468cc4ab56ec477b890d2e4cc38a2ddc',uuid=5df43f57-f8b3-4438-8e7d-f30c4bb4ae81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "address": "fa:16:3e:22:07:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25663d0b-d0", "ovs_interfaceid": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.588 186180 DEBUG nova.network.os_vif_util [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Converting VIF {"id": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "address": "fa:16:3e:22:07:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25663d0b-d0", "ovs_interfaceid": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.589 186180 DEBUG nova.network.os_vif_util [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:07:86,bridge_name='br-int',has_traffic_filtering=True,id=25663d0b-d0ac-42cb-aa23-764bc38359f3,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25663d0b-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.589 186180 DEBUG nova.objects.instance [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lazy-loading 'pci_devices' on Instance uuid 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.614 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:30:50 compute-0 nova_compute[186176]:   <uuid>5df43f57-f8b3-4438-8e7d-f30c4bb4ae81</uuid>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   <name>instance-0000000a</name>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-924773153</nova:name>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:30:50</nova:creationTime>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:30:50 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:30:50 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:30:50 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:30:50 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:30:50 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:30:50 compute-0 nova_compute[186176]:         <nova:user uuid="468cc4ab56ec477b890d2e4cc38a2ddc">tempest-TestExecuteHostMaintenanceStrategy-810545833-project-member</nova:user>
Feb 16 17:30:50 compute-0 nova_compute[186176]:         <nova:project uuid="6fb6560dd6834661a01ab8901000d6ac">tempest-TestExecuteHostMaintenanceStrategy-810545833</nova:project>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:30:50 compute-0 nova_compute[186176]:         <nova:port uuid="25663d0b-d0ac-42cb-aa23-764bc38359f3">
Feb 16 17:30:50 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <system>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <entry name="serial">5df43f57-f8b3-4438-8e7d-f30c4bb4ae81</entry>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <entry name="uuid">5df43f57-f8b3-4438-8e7d-f30c4bb4ae81</entry>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     </system>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   <os>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   </os>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   <features>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   </features>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk.config"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:22:07:86"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <target dev="tap25663d0b-d0"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/console.log" append="off"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <video>
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     </video>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:30:50 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:30:50 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:30:50 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:30:50 compute-0 nova_compute[186176]: </domain>
Feb 16 17:30:50 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.615 186180 DEBUG nova.compute.manager [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Preparing to wait for external event network-vif-plugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.616 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.616 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.616 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.617 186180 DEBUG nova.virt.libvirt.vif [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:30:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-924773153',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-924773153',id=10,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6fb6560dd6834661a01ab8901000d6ac',ramdisk_id='',reservation_id='r-npsi4q2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-810545833',owner_user_name='tempest-TestExecuteHostMainte
nanceStrategy-810545833-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:30:47Z,user_data=None,user_id='468cc4ab56ec477b890d2e4cc38a2ddc',uuid=5df43f57-f8b3-4438-8e7d-f30c4bb4ae81,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "address": "fa:16:3e:22:07:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25663d0b-d0", "ovs_interfaceid": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.617 186180 DEBUG nova.network.os_vif_util [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Converting VIF {"id": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "address": "fa:16:3e:22:07:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25663d0b-d0", "ovs_interfaceid": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.617 186180 DEBUG nova.network.os_vif_util [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:07:86,bridge_name='br-int',has_traffic_filtering=True,id=25663d0b-d0ac-42cb-aa23-764bc38359f3,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25663d0b-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.618 186180 DEBUG os_vif [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:07:86,bridge_name='br-int',has_traffic_filtering=True,id=25663d0b-d0ac-42cb-aa23-764bc38359f3,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25663d0b-d0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.618 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.619 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.619 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.624 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.625 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25663d0b-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.625 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap25663d0b-d0, col_values=(('external_ids', {'iface-id': '25663d0b-d0ac-42cb-aa23-764bc38359f3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:07:86', 'vm-uuid': '5df43f57-f8b3-4438-8e7d-f30c4bb4ae81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.627 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.629 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:30:50 compute-0 NetworkManager[56463]: <info>  [1771263050.6295] manager: (tap25663d0b-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.635 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.636 186180 INFO os_vif [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:07:86,bridge_name='br-int',has_traffic_filtering=True,id=25663d0b-d0ac-42cb-aa23-764bc38359f3,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25663d0b-d0')
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.720 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.721 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.722 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] No VIF found with MAC fa:16:3e:22:07:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:30:50 compute-0 nova_compute[186176]: 2026-02-16 17:30:50.723 186180 INFO nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Using config drive
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.319 186180 INFO nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Creating config drive at /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk.config
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.325 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6ugo1blv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.442 186180 DEBUG oslo_concurrency.processutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6ugo1blv" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:30:51 compute-0 kernel: tap25663d0b-d0: entered promiscuous mode
Feb 16 17:30:51 compute-0 NetworkManager[56463]: <info>  [1771263051.4981] manager: (tap25663d0b-d0): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Feb 16 17:30:51 compute-0 ovn_controller[96437]: 2026-02-16T17:30:51Z|00086|binding|INFO|Claiming lport 25663d0b-d0ac-42cb-aa23-764bc38359f3 for this chassis.
Feb 16 17:30:51 compute-0 ovn_controller[96437]: 2026-02-16T17:30:51Z|00087|binding|INFO|25663d0b-d0ac-42cb-aa23-764bc38359f3: Claiming fa:16:3e:22:07:86 10.100.0.12
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.500 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.509 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.522 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:51 compute-0 ovn_controller[96437]: 2026-02-16T17:30:51Z|00088|binding|INFO|Setting lport 25663d0b-d0ac-42cb-aa23-764bc38359f3 ovn-installed in OVS
Feb 16 17:30:51 compute-0 ovn_controller[96437]: 2026-02-16T17:30:51Z|00089|binding|INFO|Setting lport 25663d0b-d0ac-42cb-aa23-764bc38359f3 up in Southbound
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.521 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:07:86 10.100.0.12'], port_security=['fa:16:3e:22:07:86 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5df43f57-f8b3-4438-8e7d-f30c4bb4ae81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b39b2c0-1083-4978-979e-9968d58a02ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6fb6560dd6834661a01ab8901000d6ac', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1fdb7134-50ed-4080-a7be-cbd5f4bce078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6db313f-6dc7-424c-a88a-3aebdc385cca, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=25663d0b-d0ac-42cb-aa23-764bc38359f3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.523 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 25663d0b-d0ac-42cb-aa23-764bc38359f3 in datapath 4b39b2c0-1083-4978-979e-9968d58a02ef bound to our chassis
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.525 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b39b2c0-1083-4978-979e-9968d58a02ef
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.527 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:51 compute-0 systemd-machined[155631]: New machine qemu-8-instance-0000000a.
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.535 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[8bacc358-d030-4f2c-ae0a-ecda8694679d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.537 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4b39b2c0-11 in ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.539 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4b39b2c0-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.539 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[095d0c2d-f64d-4899-8ff2-f3213e7e56d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.540 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[07c58aa6-c595-4722-bec0-a673eacf06ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000a.
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.551 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[30cccead-92dc-4eef-a7ee-6c6fa74d4397]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 systemd-udevd[209355]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.563 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[424a9113-00ee-4593-9061-11a936473e06]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 NetworkManager[56463]: <info>  [1771263051.5689] device (tap25663d0b-d0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:30:51 compute-0 NetworkManager[56463]: <info>  [1771263051.5703] device (tap25663d0b-d0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.587 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[0c22680d-6c98-493e-8427-93cbfda2df0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.592 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b57fface-12b9-469b-be72-e4715d03a15b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 NetworkManager[56463]: <info>  [1771263051.5943] manager: (tap4b39b2c0-10): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Feb 16 17:30:51 compute-0 systemd-udevd[209359]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.627 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[46abe55c-8107-4b7f-9fd1-94c0feaaaa53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.631 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf5041d-fc51-4d25-adcf-79392f9c7ced]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.648 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:51 compute-0 NetworkManager[56463]: <info>  [1771263051.6552] device (tap4b39b2c0-10): carrier: link connected
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.661 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[2662ffe2-eb31-42af-96bc-68d390fbc697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.677 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[28f1e079-8445-43f5-b89a-1ab55e8d2075]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b39b2c0-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b7:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464718, 'reachable_time': 25199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209386, 'error': None, 'target': 'ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.696 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[2da2abe5-5807-47a4-b75c-b0afb88c48b3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:b7e7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464718, 'tstamp': 464718}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209388, 'error': None, 'target': 'ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.710 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[ffdf2c74-4b01-4d95-894d-41a836fa981e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b39b2c0-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b7:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464718, 'reachable_time': 25199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209390, 'error': None, 'target': 'ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.739 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[84f1c8ac-1c63-4d33-8d69-41f5c6cc215b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.741 186180 DEBUG nova.compute.manager [req-c311be8c-e041-4c95-8d79-76eaed273e92 req-cd32d734-960c-40f0-9114-3e167ba873d1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Received event network-vif-plugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.742 186180 DEBUG oslo_concurrency.lockutils [req-c311be8c-e041-4c95-8d79-76eaed273e92 req-cd32d734-960c-40f0-9114-3e167ba873d1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.742 186180 DEBUG oslo_concurrency.lockutils [req-c311be8c-e041-4c95-8d79-76eaed273e92 req-cd32d734-960c-40f0-9114-3e167ba873d1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.742 186180 DEBUG oslo_concurrency.lockutils [req-c311be8c-e041-4c95-8d79-76eaed273e92 req-cd32d734-960c-40f0-9114-3e167ba873d1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.743 186180 DEBUG nova.compute.manager [req-c311be8c-e041-4c95-8d79-76eaed273e92 req-cd32d734-960c-40f0-9114-3e167ba873d1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Processing event network-vif-plugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.804 186180 DEBUG nova.compute.manager [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.804 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263051.803286, 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.805 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] VM Started (Lifecycle Event)
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.808 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3774f1d1-75e4-40bb-815d-480d4ad46206]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.810 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b39b2c0-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.811 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.811 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.812 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b39b2c0-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:30:51 compute-0 NetworkManager[56463]: <info>  [1771263051.8171] manager: (tap4b39b2c0-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Feb 16 17:30:51 compute-0 kernel: tap4b39b2c0-10: entered promiscuous mode
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.819 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.821 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b39b2c0-10, col_values=(('external_ids', {'iface-id': 'cfb769f4-e04b-42dd-bc9b-2d451cb78490'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:30:51 compute-0 ovn_controller[96437]: 2026-02-16T17:30:51Z|00090|binding|INFO|Releasing lport cfb769f4-e04b-42dd-bc9b-2d451cb78490 from this chassis (sb_readonly=0)
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.823 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.828 186180 INFO nova.virt.libvirt.driver [-] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Instance spawned successfully.
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.829 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.831 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.833 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4b39b2c0-1083-4978-979e-9968d58a02ef.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4b39b2c0-1083-4978-979e-9968d58a02ef.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.834 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[1743c926-1b54-43c4-94b6-853176db82a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.835 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-4b39b2c0-1083-4978-979e-9968d58a02ef
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/4b39b2c0-1083-4978-979e-9968d58a02ef.pid.haproxy
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 4b39b2c0-1083-4978-979e-9968d58a02ef
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:30:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:30:51.836 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef', 'env', 'PROCESS_TAG=haproxy-4b39b2c0-1083-4978-979e-9968d58a02ef', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4b39b2c0-1083-4978-979e-9968d58a02ef.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.837 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.842 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.877 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.877 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263051.8036032, 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.878 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] VM Paused (Lifecycle Event)
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.884 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.885 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.885 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.886 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.886 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.887 186180 DEBUG nova.virt.libvirt.driver [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.895 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.900 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263051.8087776, 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.901 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] VM Resumed (Lifecycle Event)
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.924 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.929 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.952 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.973 186180 INFO nova.compute.manager [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Took 4.16 seconds to spawn the instance on the hypervisor.
Feb 16 17:30:51 compute-0 nova_compute[186176]: 2026-02-16 17:30:51.974 186180 DEBUG nova.compute.manager [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:30:52 compute-0 nova_compute[186176]: 2026-02-16 17:30:52.098 186180 INFO nova.compute.manager [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Took 4.65 seconds to build instance.
Feb 16 17:30:52 compute-0 nova_compute[186176]: 2026-02-16 17:30:52.118 186180 DEBUG oslo_concurrency.lockutils [None req-a2efb85c-221a-4e05-97d3-88224e04dbf7 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.741s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:30:52 compute-0 podman[209427]: 2026-02-16 17:30:52.29028398 +0000 UTC m=+0.081065987 container create 660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 16 17:30:52 compute-0 systemd[1]: Started libpod-conmon-660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5.scope.
Feb 16 17:30:52 compute-0 podman[209427]: 2026-02-16 17:30:52.24784328 +0000 UTC m=+0.038625327 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:30:52 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:30:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa803492633eed2947bb7a6f42389ada3e0a42ef6669c2552be4ba0f7be67fb1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:30:52 compute-0 podman[209427]: 2026-02-16 17:30:52.37755254 +0000 UTC m=+0.168334517 container init 660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:30:52 compute-0 podman[209427]: 2026-02-16 17:30:52.384188804 +0000 UTC m=+0.174970811 container start 660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:30:52 compute-0 nova_compute[186176]: 2026-02-16 17:30:52.398 186180 DEBUG nova.network.neutron [req-06696920-6e5a-440c-aaf1-50258d66851f req-db8e9435-91d8-4ba3-82e5-1bfd981700fc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Updated VIF entry in instance network info cache for port 25663d0b-d0ac-42cb-aa23-764bc38359f3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:30:52 compute-0 nova_compute[186176]: 2026-02-16 17:30:52.399 186180 DEBUG nova.network.neutron [req-06696920-6e5a-440c-aaf1-50258d66851f req-db8e9435-91d8-4ba3-82e5-1bfd981700fc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Updating instance_info_cache with network_info: [{"id": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "address": "fa:16:3e:22:07:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25663d0b-d0", "ovs_interfaceid": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:30:52 compute-0 neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef[209442]: [NOTICE]   (209446) : New worker (209448) forked
Feb 16 17:30:52 compute-0 neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef[209442]: [NOTICE]   (209446) : Loading success.
Feb 16 17:30:52 compute-0 nova_compute[186176]: 2026-02-16 17:30:52.418 186180 DEBUG oslo_concurrency.lockutils [req-06696920-6e5a-440c-aaf1-50258d66851f req-db8e9435-91d8-4ba3-82e5-1bfd981700fc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:30:54 compute-0 nova_compute[186176]: 2026-02-16 17:30:54.075 186180 DEBUG nova.compute.manager [req-927742ae-9b5f-4ac0-9a99-c8e500a8627c req-23d27105-5ff6-41d6-b113-294357b94b8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Received event network-vif-plugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:30:54 compute-0 nova_compute[186176]: 2026-02-16 17:30:54.076 186180 DEBUG oslo_concurrency.lockutils [req-927742ae-9b5f-4ac0-9a99-c8e500a8627c req-23d27105-5ff6-41d6-b113-294357b94b8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:30:54 compute-0 nova_compute[186176]: 2026-02-16 17:30:54.076 186180 DEBUG oslo_concurrency.lockutils [req-927742ae-9b5f-4ac0-9a99-c8e500a8627c req-23d27105-5ff6-41d6-b113-294357b94b8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:30:54 compute-0 nova_compute[186176]: 2026-02-16 17:30:54.077 186180 DEBUG oslo_concurrency.lockutils [req-927742ae-9b5f-4ac0-9a99-c8e500a8627c req-23d27105-5ff6-41d6-b113-294357b94b8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:30:54 compute-0 nova_compute[186176]: 2026-02-16 17:30:54.077 186180 DEBUG nova.compute.manager [req-927742ae-9b5f-4ac0-9a99-c8e500a8627c req-23d27105-5ff6-41d6-b113-294357b94b8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] No waiting events found dispatching network-vif-plugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:30:54 compute-0 nova_compute[186176]: 2026-02-16 17:30:54.077 186180 WARNING nova.compute.manager [req-927742ae-9b5f-4ac0-9a99-c8e500a8627c req-23d27105-5ff6-41d6-b113-294357b94b8a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Received unexpected event network-vif-plugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 for instance with vm_state active and task_state None.
Feb 16 17:30:55 compute-0 nova_compute[186176]: 2026-02-16 17:30:55.629 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:56 compute-0 nova_compute[186176]: 2026-02-16 17:30:56.651 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:30:59 compute-0 podman[195505]: time="2026-02-16T17:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:30:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:30:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2632 "" "Go-http-client/1.1"
Feb 16 17:31:00 compute-0 nova_compute[186176]: 2026-02-16 17:31:00.632 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:01 compute-0 openstack_network_exporter[198360]: ERROR   17:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:31:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:31:01 compute-0 openstack_network_exporter[198360]: ERROR   17:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:31:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:31:01 compute-0 nova_compute[186176]: 2026-02-16 17:31:01.654 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:04 compute-0 podman[209471]: 2026-02-16 17:31:04.12216389 +0000 UTC m=+0.081542049 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.7, release=1770267347, managed_by=edpm_ansible, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Feb 16 17:31:04 compute-0 ovn_controller[96437]: 2026-02-16T17:31:04Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:07:86 10.100.0.12
Feb 16 17:31:04 compute-0 ovn_controller[96437]: 2026-02-16T17:31:04Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:07:86 10.100.0.12
Feb 16 17:31:05 compute-0 nova_compute[186176]: 2026-02-16 17:31:05.636 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:06 compute-0 podman[209491]: 2026-02-16 17:31:06.096342135 +0000 UTC m=+0.057842232 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Feb 16 17:31:06 compute-0 nova_compute[186176]: 2026-02-16 17:31:06.657 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:10 compute-0 nova_compute[186176]: 2026-02-16 17:31:10.639 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:11 compute-0 nova_compute[186176]: 2026-02-16 17:31:11.658 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:14 compute-0 podman[209512]: 2026-02-16 17:31:14.102524824 +0000 UTC m=+0.069087001 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:31:14 compute-0 podman[209511]: 2026-02-16 17:31:14.143848116 +0000 UTC m=+0.112344221 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:31:15 compute-0 nova_compute[186176]: 2026-02-16 17:31:15.644 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:16 compute-0 nova_compute[186176]: 2026-02-16 17:31:16.661 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:20 compute-0 nova_compute[186176]: 2026-02-16 17:31:20.602 186180 DEBUG nova.virt.libvirt.driver [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Creating tmpfile /var/lib/nova/instances/tmph2divvbz to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 17:31:20 compute-0 nova_compute[186176]: 2026-02-16 17:31:20.604 186180 DEBUG nova.compute.manager [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph2divvbz',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 17:31:20 compute-0 nova_compute[186176]: 2026-02-16 17:31:20.648 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:21 compute-0 nova_compute[186176]: 2026-02-16 17:31:21.662 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:21 compute-0 ovn_controller[96437]: 2026-02-16T17:31:21Z|00091|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Feb 16 17:31:22 compute-0 nova_compute[186176]: 2026-02-16 17:31:22.083 186180 DEBUG nova.compute.manager [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph2divvbz',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b9c530d5-4366-4ac5-b769-7ad1040af4cb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 17:31:22 compute-0 nova_compute[186176]: 2026-02-16 17:31:22.122 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-b9c530d5-4366-4ac5-b769-7ad1040af4cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:31:22 compute-0 nova_compute[186176]: 2026-02-16 17:31:22.122 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-b9c530d5-4366-4ac5-b769-7ad1040af4cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:31:22 compute-0 nova_compute[186176]: 2026-02-16 17:31:22.123 186180 DEBUG nova.network.neutron [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.347 186180 DEBUG nova.network.neutron [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Updating instance_info_cache with network_info: [{"id": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "address": "fa:16:3e:f0:ac:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45a9f62b-9b", "ovs_interfaceid": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.368 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-b9c530d5-4366-4ac5-b769-7ad1040af4cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.371 186180 DEBUG nova.virt.libvirt.driver [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph2divvbz',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b9c530d5-4366-4ac5-b769-7ad1040af4cb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.372 186180 DEBUG nova.virt.libvirt.driver [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Creating instance directory: /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.373 186180 DEBUG nova.virt.libvirt.driver [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Creating disk.info with the contents: {'/var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk': 'qcow2', '/var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.374 186180 DEBUG nova.virt.libvirt.driver [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.374 186180 DEBUG nova.objects.instance [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'trusted_certs' on Instance uuid b9c530d5-4366-4ac5-b769-7ad1040af4cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.417 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.492 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.493 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.494 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.509 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.572 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.574 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.601 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.602 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.603 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.655 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.656 186180 DEBUG nova.virt.disk.api [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Checking if we can resize image /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.657 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.699 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.700 186180 DEBUG nova.virt.disk.api [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Cannot resize image /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.700 186180 DEBUG nova.objects.instance [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid b9c530d5-4366-4ac5-b769-7ad1040af4cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.749 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.769 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk.config 485376" returned: 0 in 0.020s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.772 186180 DEBUG nova.virt.libvirt.volume.remotefs [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk.config to /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 17:31:23 compute-0 nova_compute[186176]: 2026-02-16 17:31:23.773 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk.config /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.210 186180 DEBUG oslo_concurrency.processutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk.config /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.212 186180 DEBUG nova.virt.libvirt.driver [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.213 186180 DEBUG nova.virt.libvirt.vif [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:30:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1552332012',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1552332012',id=9,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:30:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6fb6560dd6834661a01ab8901000d6ac',ramdisk_id='',reservation_id='r-nm475a1b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-810545833',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-810545833-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:30:42Z,user_data=None,user_id='468cc4ab56ec477b890d2e4cc38a2ddc',uuid=b9c530d5-4366-4ac5-b769-7ad1040af4cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "address": "fa:16:3e:f0:ac:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap45a9f62b-9b", "ovs_interfaceid": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.213 186180 DEBUG nova.network.os_vif_util [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "address": "fa:16:3e:f0:ac:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap45a9f62b-9b", "ovs_interfaceid": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.214 186180 DEBUG nova.network.os_vif_util [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45a9f62b-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.215 186180 DEBUG os_vif [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45a9f62b-9b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.215 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.216 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.216 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.219 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.220 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap45a9f62b-9b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.220 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap45a9f62b-9b, col_values=(('external_ids', {'iface-id': '45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f0:ac:86', 'vm-uuid': 'b9c530d5-4366-4ac5-b769-7ad1040af4cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.222 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:24 compute-0 NetworkManager[56463]: <info>  [1771263084.2244] manager: (tap45a9f62b-9b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.224 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.231 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.232 186180 INFO os_vif [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45a9f62b-9b')
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.232 186180 DEBUG nova.virt.libvirt.driver [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 17:31:24 compute-0 nova_compute[186176]: 2026-02-16 17:31:24.232 186180 DEBUG nova.compute.manager [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph2divvbz',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b9c530d5-4366-4ac5-b769-7ad1040af4cb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 17:31:26 compute-0 nova_compute[186176]: 2026-02-16 17:31:26.665 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:28 compute-0 nova_compute[186176]: 2026-02-16 17:31:28.481 186180 DEBUG nova.network.neutron [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Port 45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 17:31:28 compute-0 nova_compute[186176]: 2026-02-16 17:31:28.484 186180 DEBUG nova.compute.manager [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmph2divvbz',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='b9c530d5-4366-4ac5-b769-7ad1040af4cb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 17:31:28 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 17:31:28 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 17:31:28 compute-0 kernel: tap45a9f62b-9b: entered promiscuous mode
Feb 16 17:31:28 compute-0 nova_compute[186176]: 2026-02-16 17:31:28.786 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:28 compute-0 NetworkManager[56463]: <info>  [1771263088.7877] manager: (tap45a9f62b-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Feb 16 17:31:28 compute-0 ovn_controller[96437]: 2026-02-16T17:31:28Z|00092|binding|INFO|Claiming lport 45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 for this additional chassis.
Feb 16 17:31:28 compute-0 ovn_controller[96437]: 2026-02-16T17:31:28Z|00093|binding|INFO|45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5: Claiming fa:16:3e:f0:ac:86 10.100.0.6
Feb 16 17:31:28 compute-0 nova_compute[186176]: 2026-02-16 17:31:28.788 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:28 compute-0 ovn_controller[96437]: 2026-02-16T17:31:28Z|00094|binding|INFO|Setting lport 45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 ovn-installed in OVS
Feb 16 17:31:28 compute-0 nova_compute[186176]: 2026-02-16 17:31:28.801 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:28 compute-0 systemd-machined[155631]: New machine qemu-9-instance-00000009.
Feb 16 17:31:28 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Feb 16 17:31:28 compute-0 systemd-udevd[209619]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:31:28 compute-0 NetworkManager[56463]: <info>  [1771263088.8514] device (tap45a9f62b-9b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:31:28 compute-0 NetworkManager[56463]: <info>  [1771263088.8523] device (tap45a9f62b-9b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:31:29 compute-0 nova_compute[186176]: 2026-02-16 17:31:29.222 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:29 compute-0 nova_compute[186176]: 2026-02-16 17:31:29.476 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263089.4762914, b9c530d5-4366-4ac5-b769-7ad1040af4cb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:31:29 compute-0 nova_compute[186176]: 2026-02-16 17:31:29.477 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] VM Started (Lifecycle Event)
Feb 16 17:31:29 compute-0 nova_compute[186176]: 2026-02-16 17:31:29.502 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:31:29 compute-0 podman[195505]: time="2026-02-16T17:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:31:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:31:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Feb 16 17:31:30 compute-0 nova_compute[186176]: 2026-02-16 17:31:30.053 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263090.0531728, b9c530d5-4366-4ac5-b769-7ad1040af4cb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:31:30 compute-0 nova_compute[186176]: 2026-02-16 17:31:30.053 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] VM Resumed (Lifecycle Event)
Feb 16 17:31:30 compute-0 nova_compute[186176]: 2026-02-16 17:31:30.071 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:31:30 compute-0 nova_compute[186176]: 2026-02-16 17:31:30.075 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:31:30 compute-0 nova_compute[186176]: 2026-02-16 17:31:30.095 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 17:31:31 compute-0 ovn_controller[96437]: 2026-02-16T17:31:31Z|00095|binding|INFO|Claiming lport 45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 for this chassis.
Feb 16 17:31:31 compute-0 ovn_controller[96437]: 2026-02-16T17:31:31Z|00096|binding|INFO|45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5: Claiming fa:16:3e:f0:ac:86 10.100.0.6
Feb 16 17:31:31 compute-0 ovn_controller[96437]: 2026-02-16T17:31:31Z|00097|binding|INFO|Setting lport 45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 up in Southbound
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.052 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:ac:86 10.100.0.6'], port_security=['fa:16:3e:f0:ac:86 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b9c530d5-4366-4ac5-b769-7ad1040af4cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b39b2c0-1083-4978-979e-9968d58a02ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6fb6560dd6834661a01ab8901000d6ac', 'neutron:revision_number': '11', 'neutron:security_group_ids': '1fdb7134-50ed-4080-a7be-cbd5f4bce078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6db313f-6dc7-424c-a88a-3aebdc385cca, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.056 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 in datapath 4b39b2c0-1083-4978-979e-9968d58a02ef bound to our chassis
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.058 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b39b2c0-1083-4978-979e-9968d58a02ef
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.078 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[914e0cec-2cf9-491d-a407-7491bd3198bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.108 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[36c6efb4-12e9-4bb7-ae23-ad408d83d1c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.112 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[a05e950a-05bc-4ff8-bc86-01ec8365febb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.139 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[74c3aa9a-8c9b-4db3-971a-efe81e8637c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.157 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[4fe6d244-7744-400f-89b6-8f42a80e24b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b39b2c0-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b7:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464718, 'reachable_time': 25199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209654, 'error': None, 'target': 'ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.180 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[1c0423bf-0cbb-4f35-ad42-d978e09dc7a8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b39b2c0-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464729, 'tstamp': 464729}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209655, 'error': None, 'target': 'ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b39b2c0-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464732, 'tstamp': 464732}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209655, 'error': None, 'target': 'ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.183 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b39b2c0-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:31 compute-0 nova_compute[186176]: 2026-02-16 17:31:31.185 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:31 compute-0 nova_compute[186176]: 2026-02-16 17:31:31.186 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.189 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b39b2c0-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.190 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.191 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b39b2c0-10, col_values=(('external_ids', {'iface-id': 'cfb769f4-e04b-42dd-bc9b-2d451cb78490'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:31.192 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:31:31 compute-0 nova_compute[186176]: 2026-02-16 17:31:31.198 186180 INFO nova.compute.manager [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Post operation of migration started
Feb 16 17:31:31 compute-0 nova_compute[186176]: 2026-02-16 17:31:31.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:31 compute-0 nova_compute[186176]: 2026-02-16 17:31:31.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 17:31:31 compute-0 nova_compute[186176]: 2026-02-16 17:31:31.341 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 17:31:31 compute-0 openstack_network_exporter[198360]: ERROR   17:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:31:31 compute-0 openstack_network_exporter[198360]: ERROR   17:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:31:31 compute-0 nova_compute[186176]: 2026-02-16 17:31:31.512 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-b9c530d5-4366-4ac5-b769-7ad1040af4cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:31:31 compute-0 nova_compute[186176]: 2026-02-16 17:31:31.512 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-b9c530d5-4366-4ac5-b769-7ad1040af4cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:31:31 compute-0 nova_compute[186176]: 2026-02-16 17:31:31.513 186180 DEBUG nova.network.neutron [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:31:31 compute-0 nova_compute[186176]: 2026-02-16 17:31:31.672 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:33 compute-0 nova_compute[186176]: 2026-02-16 17:31:33.338 186180 DEBUG nova.network.neutron [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Updating instance_info_cache with network_info: [{"id": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "address": "fa:16:3e:f0:ac:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45a9f62b-9b", "ovs_interfaceid": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:31:34 compute-0 nova_compute[186176]: 2026-02-16 17:31:34.225 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:34 compute-0 nova_compute[186176]: 2026-02-16 17:31:34.379 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-b9c530d5-4366-4ac5-b769-7ad1040af4cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:31:34 compute-0 nova_compute[186176]: 2026-02-16 17:31:34.400 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:34 compute-0 nova_compute[186176]: 2026-02-16 17:31:34.401 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:34 compute-0 nova_compute[186176]: 2026-02-16 17:31:34.402 186180 DEBUG oslo_concurrency.lockutils [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:34 compute-0 nova_compute[186176]: 2026-02-16 17:31:34.409 186180 INFO nova.virt.libvirt.driver [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 17:31:34 compute-0 virtqemud[185389]: Domain id=9 name='instance-00000009' uuid=b9c530d5-4366-4ac5-b769-7ad1040af4cb is tainted: custom-monitor
Feb 16 17:31:35 compute-0 podman[209656]: 2026-02-16 17:31:35.1512259 +0000 UTC m=+0.110230535 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, name=ubi9/ubi-minimal, version=9.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347)
Feb 16 17:31:35 compute-0 nova_compute[186176]: 2026-02-16 17:31:35.342 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:35 compute-0 nova_compute[186176]: 2026-02-16 17:31:35.343 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:31:35 compute-0 nova_compute[186176]: 2026-02-16 17:31:35.343 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:31:35 compute-0 nova_compute[186176]: 2026-02-16 17:31:35.417 186180 INFO nova.virt.libvirt.driver [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 17:31:35 compute-0 nova_compute[186176]: 2026-02-16 17:31:35.496 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:31:35 compute-0 nova_compute[186176]: 2026-02-16 17:31:35.497 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:31:35 compute-0 nova_compute[186176]: 2026-02-16 17:31:35.497 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:31:35 compute-0 nova_compute[186176]: 2026-02-16 17:31:35.497 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:31:36 compute-0 nova_compute[186176]: 2026-02-16 17:31:36.424 186180 INFO nova.virt.libvirt.driver [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 17:31:36 compute-0 nova_compute[186176]: 2026-02-16 17:31:36.430 186180 DEBUG nova.compute.manager [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:31:36 compute-0 nova_compute[186176]: 2026-02-16 17:31:36.450 186180 DEBUG nova.objects.instance [None req-c3a7ccd8-33ce-4811-a87a-940c84dab73c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 17:31:36 compute-0 nova_compute[186176]: 2026-02-16 17:31:36.669 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:37 compute-0 podman[209678]: 2026-02-16 17:31:37.114566515 +0000 UTC m=+0.079385692 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:31:37 compute-0 nova_compute[186176]: 2026-02-16 17:31:37.871 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Updating instance_info_cache with network_info: [{"id": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "address": "fa:16:3e:22:07:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25663d0b-d0", "ovs_interfaceid": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:31:37 compute-0 nova_compute[186176]: 2026-02-16 17:31:37.895 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:31:37 compute-0 nova_compute[186176]: 2026-02-16 17:31:37.895 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:31:37 compute-0 nova_compute[186176]: 2026-02-16 17:31:37.896 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:37 compute-0 nova_compute[186176]: 2026-02-16 17:31:37.896 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:31:37 compute-0 nova_compute[186176]: 2026-02-16 17:31:37.897 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:37 compute-0 nova_compute[186176]: 2026-02-16 17:31:37.897 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 17:31:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:38.163 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:38.164 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:38.165 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:38 compute-0 nova_compute[186176]: 2026-02-16 17:31:38.327 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:39 compute-0 nova_compute[186176]: 2026-02-16 17:31:39.227 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:39 compute-0 nova_compute[186176]: 2026-02-16 17:31:39.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.330 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.364 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.365 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.365 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.366 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.442 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.522 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.524 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.599 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.608 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.673 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.674 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.721 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.864 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.865 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5540MB free_disk=73.16641616821289GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.865 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.865 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.983 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.983 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance b9c530d5-4366-4ac5-b769-7ad1040af4cb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.983 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:31:40 compute-0 nova_compute[186176]: 2026-02-16 17:31:40.984 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:31:41 compute-0 nova_compute[186176]: 2026-02-16 17:31:41.083 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:31:41 compute-0 nova_compute[186176]: 2026-02-16 17:31:41.096 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:31:41 compute-0 nova_compute[186176]: 2026-02-16 17:31:41.126 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:31:41 compute-0 nova_compute[186176]: 2026-02-16 17:31:41.126 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:41 compute-0 nova_compute[186176]: 2026-02-16 17:31:41.671 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.101 186180 DEBUG oslo_concurrency.lockutils [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.102 186180 DEBUG oslo_concurrency.lockutils [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.102 186180 DEBUG oslo_concurrency.lockutils [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.102 186180 DEBUG oslo_concurrency.lockutils [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.102 186180 DEBUG oslo_concurrency.lockutils [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.104 186180 INFO nova.compute.manager [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Terminating instance
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.106 186180 DEBUG nova.compute.manager [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.113 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.113 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:42 compute-0 kernel: tap25663d0b-d0 (unregistering): left promiscuous mode
Feb 16 17:31:42 compute-0 NetworkManager[56463]: <info>  [1771263102.1334] device (tap25663d0b-d0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.137 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 ovn_controller[96437]: 2026-02-16T17:31:42Z|00098|binding|INFO|Releasing lport 25663d0b-d0ac-42cb-aa23-764bc38359f3 from this chassis (sb_readonly=0)
Feb 16 17:31:42 compute-0 ovn_controller[96437]: 2026-02-16T17:31:42Z|00099|binding|INFO|Setting lport 25663d0b-d0ac-42cb-aa23-764bc38359f3 down in Southbound
Feb 16 17:31:42 compute-0 ovn_controller[96437]: 2026-02-16T17:31:42Z|00100|binding|INFO|Removing iface tap25663d0b-d0 ovn-installed in OVS
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.142 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.147 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:07:86 10.100.0.12'], port_security=['fa:16:3e:22:07:86 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '5df43f57-f8b3-4438-8e7d-f30c4bb4ae81', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b39b2c0-1083-4978-979e-9968d58a02ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6fb6560dd6834661a01ab8901000d6ac', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1fdb7134-50ed-4080-a7be-cbd5f4bce078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6db313f-6dc7-424c-a88a-3aebdc385cca, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=25663d0b-d0ac-42cb-aa23-764bc38359f3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.149 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.153 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 25663d0b-d0ac-42cb-aa23-764bc38359f3 in datapath 4b39b2c0-1083-4978-979e-9968d58a02ef unbound from our chassis
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.155 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4b39b2c0-1083-4978-979e-9968d58a02ef
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.174 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[0efc93a7-0c51-4b7f-a333-95733164be97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:42 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Feb 16 17:31:42 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Consumed 13.160s CPU time.
Feb 16 17:31:42 compute-0 systemd-machined[155631]: Machine qemu-8-instance-0000000a terminated.
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.214 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[fad99c2c-214e-44e0-8083-5eee7ae05fb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.219 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[6af1deba-1176-4df7-8013-a9195285088f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.255 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[bb58f80c-3c82-49cd-ac34-f77f0edb6eb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.275 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e89a226d-85be-4177-a2a0-c21c9fb5d87e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4b39b2c0-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b7:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464718, 'reachable_time': 25199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209723, 'error': None, 'target': 'ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.297 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3b49be7c-d88f-4a75-b07b-e24e98326cc9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4b39b2c0-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464729, 'tstamp': 464729}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209724, 'error': None, 'target': 'ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4b39b2c0-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 464732, 'tstamp': 464732}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209724, 'error': None, 'target': 'ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.300 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b39b2c0-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.302 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.308 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.308 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b39b2c0-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.309 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.309 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4b39b2c0-10, col_values=(('external_ids', {'iface-id': 'cfb769f4-e04b-42dd-bc9b-2d451cb78490'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.309 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.327 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.330 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.369 186180 INFO nova.virt.libvirt.driver [-] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Instance destroyed successfully.
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.370 186180 DEBUG nova.objects.instance [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lazy-loading 'resources' on Instance uuid 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.383 186180 DEBUG nova.virt.libvirt.vif [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:30:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-924773153',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-924773153',id=10,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:30:51Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6fb6560dd6834661a01ab8901000d6ac',ramdisk_id='',reservation_id='r-npsi4q2k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_r
am='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-810545833',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-810545833-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:30:52Z,user_data=None,user_id='468cc4ab56ec477b890d2e4cc38a2ddc',uuid=5df43f57-f8b3-4438-8e7d-f30c4bb4ae81,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "address": "fa:16:3e:22:07:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25663d0b-d0", "ovs_interfaceid": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.383 186180 DEBUG nova.network.os_vif_util [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Converting VIF {"id": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "address": "fa:16:3e:22:07:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25663d0b-d0", "ovs_interfaceid": "25663d0b-d0ac-42cb-aa23-764bc38359f3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.384 186180 DEBUG nova.network.os_vif_util [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:07:86,bridge_name='br-int',has_traffic_filtering=True,id=25663d0b-d0ac-42cb-aa23-764bc38359f3,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25663d0b-d0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.384 186180 DEBUG os_vif [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:07:86,bridge_name='br-int',has_traffic_filtering=True,id=25663d0b-d0ac-42cb-aa23-764bc38359f3,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25663d0b-d0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.386 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.386 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25663d0b-d0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.394 186180 DEBUG nova.compute.manager [req-8dbe851b-b892-45b4-8fb9-8c5e3c01e0d6 req-aeff7acb-d8c6-4da4-b79d-0443933da435 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Received event network-vif-unplugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.394 186180 DEBUG oslo_concurrency.lockutils [req-8dbe851b-b892-45b4-8fb9-8c5e3c01e0d6 req-aeff7acb-d8c6-4da4-b79d-0443933da435 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.395 186180 DEBUG oslo_concurrency.lockutils [req-8dbe851b-b892-45b4-8fb9-8c5e3c01e0d6 req-aeff7acb-d8c6-4da4-b79d-0443933da435 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.395 186180 DEBUG oslo_concurrency.lockutils [req-8dbe851b-b892-45b4-8fb9-8c5e3c01e0d6 req-aeff7acb-d8c6-4da4-b79d-0443933da435 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.395 186180 DEBUG nova.compute.manager [req-8dbe851b-b892-45b4-8fb9-8c5e3c01e0d6 req-aeff7acb-d8c6-4da4-b79d-0443933da435 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] No waiting events found dispatching network-vif-unplugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.396 186180 DEBUG nova.compute.manager [req-8dbe851b-b892-45b4-8fb9-8c5e3c01e0d6 req-aeff7acb-d8c6-4da4-b79d-0443933da435 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Received event network-vif-unplugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.421 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.422 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.425 186180 INFO os_vif [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:07:86,bridge_name='br-int',has_traffic_filtering=True,id=25663d0b-d0ac-42cb-aa23-764bc38359f3,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25663d0b-d0')
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.426 186180 INFO nova.virt.libvirt.driver [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Deleting instance files /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81_del
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.426 186180 INFO nova.virt.libvirt.driver [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Deletion of /var/lib/nova/instances/5df43f57-f8b3-4438-8e7d-f30c4bb4ae81_del complete
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.481 186180 INFO nova.compute.manager [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Took 0.38 seconds to destroy the instance on the hypervisor.
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.482 186180 DEBUG oslo.service.loopingcall [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.482 186180 DEBUG nova.compute.manager [-] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.482 186180 DEBUG nova.network.neutron [-] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.938 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.939 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:42.940 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:31:42 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.966 186180 DEBUG nova.network.neutron [-] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:42.999 186180 INFO nova.compute.manager [-] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Took 0.52 seconds to deallocate network for instance.
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.049 186180 DEBUG oslo_concurrency.lockutils [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.050 186180 DEBUG oslo_concurrency.lockutils [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.114 186180 DEBUG nova.compute.provider_tree [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.137 186180 DEBUG nova.scheduler.client.report [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.159 186180 DEBUG oslo_concurrency.lockutils [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.182 186180 INFO nova.scheduler.client.report [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Deleted allocations for instance 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.257 186180 DEBUG oslo_concurrency.lockutils [None req-401a6790-e75e-450d-b6c5-3e286f49b77f 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.961 186180 DEBUG oslo_concurrency.lockutils [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.962 186180 DEBUG oslo_concurrency.lockutils [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.964 186180 DEBUG oslo_concurrency.lockutils [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.965 186180 DEBUG oslo_concurrency.lockutils [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.965 186180 DEBUG oslo_concurrency.lockutils [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.967 186180 INFO nova.compute.manager [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Terminating instance
Feb 16 17:31:43 compute-0 nova_compute[186176]: 2026-02-16 17:31:43.968 186180 DEBUG nova.compute.manager [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:31:43 compute-0 kernel: tap45a9f62b-9b (unregistering): left promiscuous mode
Feb 16 17:31:44 compute-0 NetworkManager[56463]: <info>  [1771263104.0001] device (tap45a9f62b-9b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:31:44 compute-0 ovn_controller[96437]: 2026-02-16T17:31:44Z|00101|binding|INFO|Releasing lport 45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 from this chassis (sb_readonly=0)
Feb 16 17:31:44 compute-0 ovn_controller[96437]: 2026-02-16T17:31:44Z|00102|binding|INFO|Setting lport 45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 down in Southbound
Feb 16 17:31:44 compute-0 ovn_controller[96437]: 2026-02-16T17:31:44Z|00103|binding|INFO|Removing iface tap45a9f62b-9b ovn-installed in OVS
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.046 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.052 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.073 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:ac:86 10.100.0.6'], port_security=['fa:16:3e:f0:ac:86 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b9c530d5-4366-4ac5-b769-7ad1040af4cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4b39b2c0-1083-4978-979e-9968d58a02ef', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6fb6560dd6834661a01ab8901000d6ac', 'neutron:revision_number': '13', 'neutron:security_group_ids': '1fdb7134-50ed-4080-a7be-cbd5f4bce078', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e6db313f-6dc7-424c-a88a-3aebdc385cca, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.075 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 in datapath 4b39b2c0-1083-4978-979e-9968d58a02ef unbound from our chassis
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.076 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4b39b2c0-1083-4978-979e-9968d58a02ef, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.078 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[5b76c17a-06c2-4fca-b4c9-c3e476cd1a2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.079 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef namespace which is not needed anymore
Feb 16 17:31:44 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Feb 16 17:31:44 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 1.651s CPU time.
Feb 16 17:31:44 compute-0 systemd-machined[155631]: Machine qemu-9-instance-00000009 terminated.
Feb 16 17:31:44 compute-0 NetworkManager[56463]: <info>  [1771263104.1900] manager: (tap45a9f62b-9b): new Tun device (/org/freedesktop/NetworkManager/Devices/47)
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.192 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.198 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.237 186180 INFO nova.virt.libvirt.driver [-] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Instance destroyed successfully.
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.238 186180 DEBUG nova.objects.instance [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lazy-loading 'resources' on Instance uuid b9c530d5-4366-4ac5-b769-7ad1040af4cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:31:44 compute-0 neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef[209442]: [NOTICE]   (209446) : haproxy version is 2.8.14-c23fe91
Feb 16 17:31:44 compute-0 neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef[209442]: [NOTICE]   (209446) : path to executable is /usr/sbin/haproxy
Feb 16 17:31:44 compute-0 neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef[209442]: [WARNING]  (209446) : Exiting Master process...
Feb 16 17:31:44 compute-0 neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef[209442]: [ALERT]    (209446) : Current worker (209448) exited with code 143 (Terminated)
Feb 16 17:31:44 compute-0 neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef[209442]: [WARNING]  (209446) : All workers exited. Exiting... (0)
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.264 186180 DEBUG nova.virt.libvirt.vif [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T17:30:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1552332012',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1552332012',id=9,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:30:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6fb6560dd6834661a01ab8901000d6ac',ramdisk_id='',reservation_id='r-nm475a1b',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-810545833',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-810545833-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:31:36Z,user_data=None,user_id='468cc4ab56ec477b890d2e4cc38a2ddc',uuid=b9c530d5-4366-4ac5-b769-7ad1040af4cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "address": "fa:16:3e:f0:ac:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45a9f62b-9b", "ovs_interfaceid": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:31:44 compute-0 systemd[1]: libpod-660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5.scope: Deactivated successfully.
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.265 186180 DEBUG nova.network.os_vif_util [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Converting VIF {"id": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "address": "fa:16:3e:f0:ac:86", "network": {"id": "4b39b2c0-1083-4978-979e-9968d58a02ef", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-2062764984-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6fb6560dd6834661a01ab8901000d6ac", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap45a9f62b-9b", "ovs_interfaceid": "45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.267 186180 DEBUG nova.network.os_vif_util [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45a9f62b-9b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.267 186180 DEBUG os_vif [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45a9f62b-9b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.270 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.270 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap45a9f62b-9b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:44 compute-0 podman[209768]: 2026-02-16 17:31:44.273432814 +0000 UTC m=+0.065728472 container died 660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.273 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.276 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.278 186180 INFO os_vif [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:ac:86,bridge_name='br-int',has_traffic_filtering=True,id=45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5,network=Network(4b39b2c0-1083-4978-979e-9968d58a02ef),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap45a9f62b-9b')
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.279 186180 INFO nova.virt.libvirt.driver [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Deleting instance files /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb_del
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.280 186180 INFO nova.virt.libvirt.driver [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Deletion of /var/lib/nova/instances/b9c530d5-4366-4ac5-b769-7ad1040af4cb_del complete
Feb 16 17:31:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5-userdata-shm.mount: Deactivated successfully.
Feb 16 17:31:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-aa803492633eed2947bb7a6f42389ada3e0a42ef6669c2552be4ba0f7be67fb1-merged.mount: Deactivated successfully.
Feb 16 17:31:44 compute-0 podman[209768]: 2026-02-16 17:31:44.321088999 +0000 UTC m=+0.113384627 container cleanup 660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:31:44 compute-0 systemd[1]: libpod-conmon-660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5.scope: Deactivated successfully.
Feb 16 17:31:44 compute-0 podman[209790]: 2026-02-16 17:31:44.338901718 +0000 UTC m=+0.093243837 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:31:44 compute-0 podman[209793]: 2026-02-16 17:31:44.353731549 +0000 UTC m=+0.103603390 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.369 186180 INFO nova.compute.manager [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Took 0.40 seconds to destroy the instance on the hypervisor.
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.370 186180 DEBUG oslo.service.loopingcall [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.370 186180 DEBUG nova.compute.manager [-] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.371 186180 DEBUG nova.network.neutron [-] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:31:44 compute-0 podman[209854]: 2026-02-16 17:31:44.380602917 +0000 UTC m=+0.038635989 container remove 660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.385 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e46ecfc4-17da-4e57-90bd-577679334b86]: (4, ('Mon Feb 16 05:31:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef (660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5)\n660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5\nMon Feb 16 05:31:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef (660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5)\n660b0fb3919e1e9e01ce2020c2f0a7729d7d7e03cd3d2cce6e8d9b052c3023d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.387 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e77cc052-0e8e-46e3-abb1-7bef8fd169dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.388 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b39b2c0-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:44 compute-0 kernel: tap4b39b2c0-10: left promiscuous mode
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.391 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.396 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.397 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[4528387b-890f-4070-a7b3-6e67d024d52a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.423 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[480d446f-4253-48ed-b267-0efb36f2d45e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.424 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b386352f-b13c-4823-810a-c74a81838474]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.439 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b7636ccb-2363-4855-bbd4-498fdf8b431e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 464711, 'reachable_time': 37990, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209872, 'error': None, 'target': 'ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d4b39b2c0\x2d1083\x2d4978\x2d979e\x2d9968d58a02ef.mount: Deactivated successfully.
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.442 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4b39b2c0-1083-4978-979e-9968d58a02ef deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:31:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:44.442 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[385e4257-5dba-4717-96b2-5e24975fdd21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.487 186180 DEBUG nova.compute.manager [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Received event network-vif-plugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.488 186180 DEBUG oslo_concurrency.lockutils [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.489 186180 DEBUG oslo_concurrency.lockutils [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.489 186180 DEBUG oslo_concurrency.lockutils [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5df43f57-f8b3-4438-8e7d-f30c4bb4ae81-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.490 186180 DEBUG nova.compute.manager [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] No waiting events found dispatching network-vif-plugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.493 186180 WARNING nova.compute.manager [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Received unexpected event network-vif-plugged-25663d0b-d0ac-42cb-aa23-764bc38359f3 for instance with vm_state deleted and task_state None.
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.494 186180 DEBUG nova.compute.manager [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Received event network-vif-deleted-25663d0b-d0ac-42cb-aa23-764bc38359f3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.494 186180 DEBUG nova.compute.manager [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Received event network-vif-unplugged-45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.495 186180 DEBUG oslo_concurrency.lockutils [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.495 186180 DEBUG oslo_concurrency.lockutils [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.496 186180 DEBUG oslo_concurrency.lockutils [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.496 186180 DEBUG nova.compute.manager [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] No waiting events found dispatching network-vif-unplugged-45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:31:44 compute-0 nova_compute[186176]: 2026-02-16 17:31:44.497 186180 DEBUG nova.compute.manager [req-80c3c0d6-7e2c-401e-b9da-226831559521 req-24e6c719-3437-4d05-9a1b-1cf3c818f707 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Received event network-vif-unplugged-45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:31:45 compute-0 nova_compute[186176]: 2026-02-16 17:31:45.021 186180 DEBUG nova.network.neutron [-] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:31:45 compute-0 nova_compute[186176]: 2026-02-16 17:31:45.039 186180 INFO nova.compute.manager [-] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Took 0.67 seconds to deallocate network for instance.
Feb 16 17:31:45 compute-0 nova_compute[186176]: 2026-02-16 17:31:45.078 186180 DEBUG oslo_concurrency.lockutils [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:45 compute-0 nova_compute[186176]: 2026-02-16 17:31:45.079 186180 DEBUG oslo_concurrency.lockutils [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:45 compute-0 nova_compute[186176]: 2026-02-16 17:31:45.133 186180 DEBUG nova.compute.provider_tree [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:31:45 compute-0 nova_compute[186176]: 2026-02-16 17:31:45.164 186180 DEBUG nova.scheduler.client.report [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:31:45 compute-0 nova_compute[186176]: 2026-02-16 17:31:45.187 186180 DEBUG oslo_concurrency.lockutils [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:45 compute-0 nova_compute[186176]: 2026-02-16 17:31:45.212 186180 INFO nova.scheduler.client.report [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Deleted allocations for instance b9c530d5-4366-4ac5-b769-7ad1040af4cb
Feb 16 17:31:45 compute-0 nova_compute[186176]: 2026-02-16 17:31:45.266 186180 DEBUG oslo_concurrency.lockutils [None req-64fc490b-4901-4dc4-bc40-b295019eb4ad 468cc4ab56ec477b890d2e4cc38a2ddc 6fb6560dd6834661a01ab8901000d6ac - - default default] Lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:46 compute-0 nova_compute[186176]: 2026-02-16 17:31:46.555 186180 DEBUG nova.compute.manager [req-1e698577-5446-4cb4-8d72-369e67791fd0 req-7f5d29e3-b94d-4218-81ea-3d1944f36d31 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Received event network-vif-plugged-45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:31:46 compute-0 nova_compute[186176]: 2026-02-16 17:31:46.555 186180 DEBUG oslo_concurrency.lockutils [req-1e698577-5446-4cb4-8d72-369e67791fd0 req-7f5d29e3-b94d-4218-81ea-3d1944f36d31 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:31:46 compute-0 nova_compute[186176]: 2026-02-16 17:31:46.556 186180 DEBUG oslo_concurrency.lockutils [req-1e698577-5446-4cb4-8d72-369e67791fd0 req-7f5d29e3-b94d-4218-81ea-3d1944f36d31 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:31:46 compute-0 nova_compute[186176]: 2026-02-16 17:31:46.556 186180 DEBUG oslo_concurrency.lockutils [req-1e698577-5446-4cb4-8d72-369e67791fd0 req-7f5d29e3-b94d-4218-81ea-3d1944f36d31 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b9c530d5-4366-4ac5-b769-7ad1040af4cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:31:46 compute-0 nova_compute[186176]: 2026-02-16 17:31:46.557 186180 DEBUG nova.compute.manager [req-1e698577-5446-4cb4-8d72-369e67791fd0 req-7f5d29e3-b94d-4218-81ea-3d1944f36d31 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] No waiting events found dispatching network-vif-plugged-45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:31:46 compute-0 nova_compute[186176]: 2026-02-16 17:31:46.557 186180 WARNING nova.compute.manager [req-1e698577-5446-4cb4-8d72-369e67791fd0 req-7f5d29e3-b94d-4218-81ea-3d1944f36d31 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Received unexpected event network-vif-plugged-45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 for instance with vm_state deleted and task_state None.
Feb 16 17:31:46 compute-0 nova_compute[186176]: 2026-02-16 17:31:46.558 186180 DEBUG nova.compute.manager [req-1e698577-5446-4cb4-8d72-369e67791fd0 req-7f5d29e3-b94d-4218-81ea-3d1944f36d31 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Received event network-vif-deleted-45a9f62b-9b6d-43ae-bde0-e71c5a18f7a5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:31:46 compute-0 nova_compute[186176]: 2026-02-16 17:31:46.673 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:48 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:31:48.942 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:31:49 compute-0 nova_compute[186176]: 2026-02-16 17:31:49.273 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:49 compute-0 nova_compute[186176]: 2026-02-16 17:31:49.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:31:51 compute-0 nova_compute[186176]: 2026-02-16 17:31:51.675 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:54 compute-0 nova_compute[186176]: 2026-02-16 17:31:54.275 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:56 compute-0 nova_compute[186176]: 2026-02-16 17:31:56.677 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:57 compute-0 nova_compute[186176]: 2026-02-16 17:31:57.368 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771263102.3665183, 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:31:57 compute-0 nova_compute[186176]: 2026-02-16 17:31:57.368 186180 INFO nova.compute.manager [-] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] VM Stopped (Lifecycle Event)
Feb 16 17:31:57 compute-0 nova_compute[186176]: 2026-02-16 17:31:57.388 186180 DEBUG nova.compute.manager [None req-1f350d09-75ac-4c23-af1c-a42bf6783364 - - - - - -] [instance: 5df43f57-f8b3-4438-8e7d-f30c4bb4ae81] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:31:59 compute-0 nova_compute[186176]: 2026-02-16 17:31:59.235 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771263104.2335742, b9c530d5-4366-4ac5-b769-7ad1040af4cb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:31:59 compute-0 nova_compute[186176]: 2026-02-16 17:31:59.236 186180 INFO nova.compute.manager [-] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] VM Stopped (Lifecycle Event)
Feb 16 17:31:59 compute-0 nova_compute[186176]: 2026-02-16 17:31:59.257 186180 DEBUG nova.compute.manager [None req-581b99fc-1d46-4995-85ed-88f15ac353c6 - - - - - -] [instance: b9c530d5-4366-4ac5-b769-7ad1040af4cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:31:59 compute-0 nova_compute[186176]: 2026-02-16 17:31:59.277 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:31:59 compute-0 podman[195505]: time="2026-02-16T17:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:31:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:31:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2173 "" "Go-http-client/1.1"
Feb 16 17:32:01 compute-0 openstack_network_exporter[198360]: ERROR   17:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:32:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:32:01 compute-0 openstack_network_exporter[198360]: ERROR   17:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:32:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:32:01 compute-0 nova_compute[186176]: 2026-02-16 17:32:01.678 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:04 compute-0 nova_compute[186176]: 2026-02-16 17:32:04.279 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:06 compute-0 podman[209873]: 2026-02-16 17:32:06.135522044 +0000 UTC m=+0.105958573 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible)
Feb 16 17:32:06 compute-0 nova_compute[186176]: 2026-02-16 17:32:06.680 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:08 compute-0 podman[209894]: 2026-02-16 17:32:08.077945938 +0000 UTC m=+0.053989943 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 16 17:32:09 compute-0 nova_compute[186176]: 2026-02-16 17:32:09.281 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:11 compute-0 nova_compute[186176]: 2026-02-16 17:32:11.684 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:14 compute-0 nova_compute[186176]: 2026-02-16 17:32:14.307 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:15 compute-0 ovn_controller[96437]: 2026-02-16T17:32:15Z|00104|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Feb 16 17:32:15 compute-0 podman[209915]: 2026-02-16 17:32:15.124109957 +0000 UTC m=+0.079866375 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:32:15 compute-0 podman[209914]: 2026-02-16 17:32:15.142245864 +0000 UTC m=+0.104888254 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:32:16 compute-0 nova_compute[186176]: 2026-02-16 17:32:16.685 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:19 compute-0 nova_compute[186176]: 2026-02-16 17:32:19.339 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:20 compute-0 nova_compute[186176]: 2026-02-16 17:32:20.066 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:21 compute-0 nova_compute[186176]: 2026-02-16 17:32:21.686 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:24 compute-0 nova_compute[186176]: 2026-02-16 17:32:24.386 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:26 compute-0 nova_compute[186176]: 2026-02-16 17:32:26.688 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:29 compute-0 nova_compute[186176]: 2026-02-16 17:32:29.391 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:29 compute-0 podman[195505]: time="2026-02-16T17:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:32:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:32:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 17:32:31 compute-0 openstack_network_exporter[198360]: ERROR   17:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:32:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:32:31 compute-0 openstack_network_exporter[198360]: ERROR   17:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:32:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:32:31 compute-0 nova_compute[186176]: 2026-02-16 17:32:31.690 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:34 compute-0 nova_compute[186176]: 2026-02-16 17:32:34.449 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:36 compute-0 nova_compute[186176]: 2026-02-16 17:32:36.319 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:32:36 compute-0 nova_compute[186176]: 2026-02-16 17:32:36.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:32:36 compute-0 nova_compute[186176]: 2026-02-16 17:32:36.320 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:32:36 compute-0 nova_compute[186176]: 2026-02-16 17:32:36.335 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:32:36 compute-0 nova_compute[186176]: 2026-02-16 17:32:36.692 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:37 compute-0 podman[209966]: 2026-02-16 17:32:37.10621314 +0000 UTC m=+0.071407902 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, version=9.7, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, release=1770267347, io.openshift.tags=minimal rhel9, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git)
Feb 16 17:32:37 compute-0 nova_compute[186176]: 2026-02-16 17:32:37.329 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:32:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:32:38.164 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:32:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:32:38.165 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:32:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:32:38.165 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:32:38 compute-0 nova_compute[186176]: 2026-02-16 17:32:38.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:32:38 compute-0 nova_compute[186176]: 2026-02-16 17:32:38.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:32:38 compute-0 nova_compute[186176]: 2026-02-16 17:32:38.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:32:39 compute-0 podman[209988]: 2026-02-16 17:32:39.087807627 +0000 UTC m=+0.058108632 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 17:32:39 compute-0 nova_compute[186176]: 2026-02-16 17:32:39.452 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.346 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.346 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.346 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.346 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.499 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.500 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5862MB free_disk=73.22393798828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.501 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.501 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.591 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.591 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.630 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.647 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.668 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:32:40 compute-0 nova_compute[186176]: 2026-02-16 17:32:40.668 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:32:41 compute-0 nova_compute[186176]: 2026-02-16 17:32:41.694 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:42 compute-0 nova_compute[186176]: 2026-02-16 17:32:42.662 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:32:42 compute-0 nova_compute[186176]: 2026-02-16 17:32:42.663 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:32:42 compute-0 nova_compute[186176]: 2026-02-16 17:32:42.663 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:32:44 compute-0 nova_compute[186176]: 2026-02-16 17:32:44.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:32:44 compute-0 nova_compute[186176]: 2026-02-16 17:32:44.484 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:46 compute-0 podman[210010]: 2026-02-16 17:32:46.153377969 +0000 UTC m=+0.108385686 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 17:32:46 compute-0 podman[210009]: 2026-02-16 17:32:46.169108613 +0000 UTC m=+0.130836528 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 16 17:32:46 compute-0 nova_compute[186176]: 2026-02-16 17:32:46.722 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:49 compute-0 nova_compute[186176]: 2026-02-16 17:32:49.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:32:49 compute-0 nova_compute[186176]: 2026-02-16 17:32:49.488 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:50 compute-0 nova_compute[186176]: 2026-02-16 17:32:50.024 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:50 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:32:50.025 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:32:50 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:32:50.026 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:32:51 compute-0 nova_compute[186176]: 2026-02-16 17:32:51.724 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:54 compute-0 nova_compute[186176]: 2026-02-16 17:32:54.491 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:54 compute-0 ovn_controller[96437]: 2026-02-16T17:32:54Z|00105|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Feb 16 17:32:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:32:56.029 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:32:56 compute-0 nova_compute[186176]: 2026-02-16 17:32:56.725 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:59 compute-0 nova_compute[186176]: 2026-02-16 17:32:59.534 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:32:59 compute-0 podman[195505]: time="2026-02-16T17:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:32:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:32:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2171 "" "Go-http-client/1.1"
Feb 16 17:33:01 compute-0 openstack_network_exporter[198360]: ERROR   17:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:33:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:33:01 compute-0 openstack_network_exporter[198360]: ERROR   17:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:33:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:33:01 compute-0 nova_compute[186176]: 2026-02-16 17:33:01.727 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:04 compute-0 nova_compute[186176]: 2026-02-16 17:33:04.564 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:06 compute-0 nova_compute[186176]: 2026-02-16 17:33:06.729 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:08 compute-0 podman[210058]: 2026-02-16 17:33:08.159955446 +0000 UTC m=+0.097150450 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, maintainer=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public)
Feb 16 17:33:09 compute-0 nova_compute[186176]: 2026-02-16 17:33:09.566 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:10 compute-0 podman[210081]: 2026-02-16 17:33:10.103213722 +0000 UTC m=+0.070670533 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 16 17:33:11 compute-0 nova_compute[186176]: 2026-02-16 17:33:11.732 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:14 compute-0 nova_compute[186176]: 2026-02-16 17:33:14.626 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:16 compute-0 nova_compute[186176]: 2026-02-16 17:33:16.734 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:17 compute-0 podman[210103]: 2026-02-16 17:33:17.100156906 +0000 UTC m=+0.056483028 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:33:17 compute-0 podman[210102]: 2026-02-16 17:33:17.190725182 +0000 UTC m=+0.148226455 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:33:18 compute-0 sshd-session[210152]: Connection closed by 87.98.166.118 port 57595
Feb 16 17:33:18 compute-0 sshd-session[210153]: Invalid user a from 87.98.166.118 port 39743
Feb 16 17:33:18 compute-0 sshd-session[210153]: Connection closed by invalid user a 87.98.166.118 port 39743 [preauth]
Feb 16 17:33:19 compute-0 nova_compute[186176]: 2026-02-16 17:33:19.630 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:21 compute-0 nova_compute[186176]: 2026-02-16 17:33:21.736 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:24 compute-0 nova_compute[186176]: 2026-02-16 17:33:24.651 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:26 compute-0 nova_compute[186176]: 2026-02-16 17:33:26.737 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:29 compute-0 nova_compute[186176]: 2026-02-16 17:33:29.673 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:29 compute-0 podman[195505]: time="2026-02-16T17:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:33:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:33:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 17:33:30 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 17:33:31 compute-0 openstack_network_exporter[198360]: ERROR   17:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:33:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:33:31 compute-0 openstack_network_exporter[198360]: ERROR   17:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:33:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:33:31 compute-0 nova_compute[186176]: 2026-02-16 17:33:31.739 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:34 compute-0 nova_compute[186176]: 2026-02-16 17:33:34.714 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:36 compute-0 nova_compute[186176]: 2026-02-16 17:33:36.742 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:38.166 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:33:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:38.166 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:33:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:38.167 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:33:38 compute-0 nova_compute[186176]: 2026-02-16 17:33:38.324 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:33:38 compute-0 nova_compute[186176]: 2026-02-16 17:33:38.329 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:33:38 compute-0 nova_compute[186176]: 2026-02-16 17:33:38.330 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:33:38 compute-0 nova_compute[186176]: 2026-02-16 17:33:38.369 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:33:38 compute-0 nova_compute[186176]: 2026-02-16 17:33:38.370 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:33:39 compute-0 podman[210156]: 2026-02-16 17:33:39.099651713 +0000 UTC m=+0.065693149 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 17:33:39 compute-0 nova_compute[186176]: 2026-02-16 17:33:39.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:33:39 compute-0 nova_compute[186176]: 2026-02-16 17:33:39.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:33:39 compute-0 nova_compute[186176]: 2026-02-16 17:33:39.718 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:40 compute-0 nova_compute[186176]: 2026-02-16 17:33:40.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:33:40 compute-0 nova_compute[186176]: 2026-02-16 17:33:40.365 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:33:40 compute-0 nova_compute[186176]: 2026-02-16 17:33:40.366 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:33:40 compute-0 nova_compute[186176]: 2026-02-16 17:33:40.366 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:33:40 compute-0 nova_compute[186176]: 2026-02-16 17:33:40.366 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:33:40 compute-0 nova_compute[186176]: 2026-02-16 17:33:40.582 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:33:40 compute-0 nova_compute[186176]: 2026-02-16 17:33:40.584 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5865MB free_disk=73.22393798828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:33:40 compute-0 nova_compute[186176]: 2026-02-16 17:33:40.584 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:33:40 compute-0 nova_compute[186176]: 2026-02-16 17:33:40.584 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:33:41 compute-0 podman[210177]: 2026-02-16 17:33:41.107225725 +0000 UTC m=+0.074674914 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 16 17:33:41 compute-0 nova_compute[186176]: 2026-02-16 17:33:41.745 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:42 compute-0 nova_compute[186176]: 2026-02-16 17:33:42.730 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:33:42 compute-0 nova_compute[186176]: 2026-02-16 17:33:42.731 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:33:42 compute-0 nova_compute[186176]: 2026-02-16 17:33:42.819 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:33:42 compute-0 nova_compute[186176]: 2026-02-16 17:33:42.838 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:33:42 compute-0 nova_compute[186176]: 2026-02-16 17:33:42.841 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:33:42 compute-0 nova_compute[186176]: 2026-02-16 17:33:42.841 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:33:44 compute-0 nova_compute[186176]: 2026-02-16 17:33:44.721 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:44 compute-0 nova_compute[186176]: 2026-02-16 17:33:44.836 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:33:44 compute-0 nova_compute[186176]: 2026-02-16 17:33:44.837 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:33:44 compute-0 nova_compute[186176]: 2026-02-16 17:33:44.837 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:33:45 compute-0 nova_compute[186176]: 2026-02-16 17:33:45.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:33:46 compute-0 nova_compute[186176]: 2026-02-16 17:33:46.747 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:48 compute-0 podman[210198]: 2026-02-16 17:33:48.155519753 +0000 UTC m=+0.114527264 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:33:48 compute-0 podman[210199]: 2026-02-16 17:33:48.158508867 +0000 UTC m=+0.109939228 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.094 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.095 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.157 186180 DEBUG nova.compute.manager [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.319 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.320 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.328 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.328 186180 INFO nova.compute.claims [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.498 186180 DEBUG nova.compute.provider_tree [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.516 186180 DEBUG nova.scheduler.client.report [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.556 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.557 186180 DEBUG nova.compute.manager [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.607 186180 DEBUG nova.compute.manager [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.608 186180 DEBUG nova.network.neutron [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.637 186180 INFO nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.659 186180 DEBUG nova.compute.manager [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:33:49 compute-0 nova_compute[186176]: 2026-02-16 17:33:49.725 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.054 186180 DEBUG nova.compute.manager [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.055 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.056 186180 INFO nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Creating image(s)
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.056 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "/var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.057 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.057 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.071 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.144 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.145 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.146 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.161 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.222 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.224 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.268 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.269 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.270 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.318 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.319 186180 DEBUG nova.virt.disk.api [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Checking if we can resize image /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.319 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.387 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.388 186180 DEBUG nova.virt.disk.api [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Cannot resize image /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.388 186180 DEBUG nova.objects.instance [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'migration_context' on Instance uuid e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.411 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.412 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Ensure instance console log exists: /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.413 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.413 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.414 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:33:50 compute-0 nova_compute[186176]: 2026-02-16 17:33:50.518 186180 DEBUG nova.policy [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54934f49b2044289bcf127662fe114b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:33:51 compute-0 nova_compute[186176]: 2026-02-16 17:33:51.739 186180 DEBUG nova.network.neutron [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Successfully created port: 41e1a19e-a3c4-4930-8b5f-d197049955d2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:33:51 compute-0 nova_compute[186176]: 2026-02-16 17:33:51.750 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:52 compute-0 nova_compute[186176]: 2026-02-16 17:33:52.927 186180 DEBUG nova.network.neutron [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Successfully updated port: 41e1a19e-a3c4-4930-8b5f-d197049955d2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:33:52 compute-0 nova_compute[186176]: 2026-02-16 17:33:52.956 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:33:52 compute-0 nova_compute[186176]: 2026-02-16 17:33:52.957 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquired lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:33:52 compute-0 nova_compute[186176]: 2026-02-16 17:33:52.957 186180 DEBUG nova.network.neutron [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:33:53 compute-0 nova_compute[186176]: 2026-02-16 17:33:53.070 186180 DEBUG nova.compute.manager [req-62f6dbb3-dbd2-426a-b275-616645321904 req-10cc898a-2f5e-4b95-9b4c-3815591574e1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-changed-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:33:53 compute-0 nova_compute[186176]: 2026-02-16 17:33:53.071 186180 DEBUG nova.compute.manager [req-62f6dbb3-dbd2-426a-b275-616645321904 req-10cc898a-2f5e-4b95-9b4c-3815591574e1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Refreshing instance network info cache due to event network-changed-41e1a19e-a3c4-4930-8b5f-d197049955d2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:33:53 compute-0 nova_compute[186176]: 2026-02-16 17:33:53.071 186180 DEBUG oslo_concurrency.lockutils [req-62f6dbb3-dbd2-426a-b275-616645321904 req-10cc898a-2f5e-4b95-9b4c-3815591574e1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:33:53 compute-0 nova_compute[186176]: 2026-02-16 17:33:53.438 186180 DEBUG nova.network.neutron [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.728 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.910 186180 DEBUG nova.network.neutron [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Updating instance_info_cache with network_info: [{"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.941 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Releasing lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.942 186180 DEBUG nova.compute.manager [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Instance network_info: |[{"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.943 186180 DEBUG oslo_concurrency.lockutils [req-62f6dbb3-dbd2-426a-b275-616645321904 req-10cc898a-2f5e-4b95-9b4c-3815591574e1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.943 186180 DEBUG nova.network.neutron [req-62f6dbb3-dbd2-426a-b275-616645321904 req-10cc898a-2f5e-4b95-9b4c-3815591574e1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Refreshing network info cache for port 41e1a19e-a3c4-4930-8b5f-d197049955d2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.948 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Start _get_guest_xml network_info=[{"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.955 186180 WARNING nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.960 186180 DEBUG nova.virt.libvirt.host [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.961 186180 DEBUG nova.virt.libvirt.host [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.969 186180 DEBUG nova.virt.libvirt.host [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.970 186180 DEBUG nova.virt.libvirt.host [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.972 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.973 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.973 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.974 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.974 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.974 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.975 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.975 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.976 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.976 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.976 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.977 186180 DEBUG nova.virt.hardware [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.983 186180 DEBUG nova.virt.libvirt.vif [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:33:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1142233594',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1142233594',id=11,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-grava8bh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:33:49Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=e4c1ab36-37d0-4a70-b99c-cd2bb7707c39,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.983 186180 DEBUG nova.network.os_vif_util [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.984 186180 DEBUG nova.network.os_vif_util [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:29:3f,bridge_name='br-int',has_traffic_filtering=True,id=41e1a19e-a3c4-4930-8b5f-d197049955d2,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41e1a19e-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:33:54 compute-0 nova_compute[186176]: 2026-02-16 17:33:54.986 186180 DEBUG nova.objects.instance [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'pci_devices' on Instance uuid e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.014 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:33:55 compute-0 nova_compute[186176]:   <uuid>e4c1ab36-37d0-4a70-b99c-cd2bb7707c39</uuid>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   <name>instance-0000000b</name>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteStrategies-server-1142233594</nova:name>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:33:54</nova:creationTime>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:33:55 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:33:55 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:33:55 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:33:55 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:33:55 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:33:55 compute-0 nova_compute[186176]:         <nova:user uuid="c54934f49b2044289bcf127662fe114b">tempest-TestExecuteStrategies-1098930400-project-member</nova:user>
Feb 16 17:33:55 compute-0 nova_compute[186176]:         <nova:project uuid="1a237c4b00c5426cb1dc6afe3c7c868c">tempest-TestExecuteStrategies-1098930400</nova:project>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:33:55 compute-0 nova_compute[186176]:         <nova:port uuid="41e1a19e-a3c4-4930-8b5f-d197049955d2">
Feb 16 17:33:55 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <system>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <entry name="serial">e4c1ab36-37d0-4a70-b99c-cd2bb7707c39</entry>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <entry name="uuid">e4c1ab36-37d0-4a70-b99c-cd2bb7707c39</entry>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     </system>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   <os>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   </os>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   <features>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   </features>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk.config"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:7c:29:3f"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <target dev="tap41e1a19e-a3"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/console.log" append="off"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <video>
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     </video>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:33:55 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:33:55 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:33:55 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:33:55 compute-0 nova_compute[186176]: </domain>
Feb 16 17:33:55 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.016 186180 DEBUG nova.compute.manager [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Preparing to wait for external event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.017 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.017 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.018 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.019 186180 DEBUG nova.virt.libvirt.vif [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:33:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1142233594',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1142233594',id=11,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-grava8bh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:33:49Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=e4c1ab36-37d0-4a70-b99c-cd2bb7707c39,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.020 186180 DEBUG nova.network.os_vif_util [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.021 186180 DEBUG nova.network.os_vif_util [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:29:3f,bridge_name='br-int',has_traffic_filtering=True,id=41e1a19e-a3c4-4930-8b5f-d197049955d2,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41e1a19e-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.022 186180 DEBUG os_vif [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:29:3f,bridge_name='br-int',has_traffic_filtering=True,id=41e1a19e-a3c4-4930-8b5f-d197049955d2,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41e1a19e-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.023 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.024 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.025 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.030 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.030 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41e1a19e-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.031 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap41e1a19e-a3, col_values=(('external_ids', {'iface-id': '41e1a19e-a3c4-4930-8b5f-d197049955d2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:29:3f', 'vm-uuid': 'e4c1ab36-37d0-4a70-b99c-cd2bb7707c39'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.033 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:55 compute-0 NetworkManager[56463]: <info>  [1771263235.0350] manager: (tap41e1a19e-a3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.036 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.044 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.046 186180 INFO os_vif [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:29:3f,bridge_name='br-int',has_traffic_filtering=True,id=41e1a19e-a3c4-4930-8b5f-d197049955d2,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41e1a19e-a3')
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.094 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.094 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.095 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No VIF found with MAC fa:16:3e:7c:29:3f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.095 186180 INFO nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Using config drive
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.723 186180 INFO nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Creating config drive at /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk.config
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.727 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpyja9je2d execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.850 186180 DEBUG oslo_concurrency.processutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpyja9je2d" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:33:55 compute-0 kernel: tap41e1a19e-a3: entered promiscuous mode
Feb 16 17:33:55 compute-0 NetworkManager[56463]: <info>  [1771263235.9206] manager: (tap41e1a19e-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/49)
Feb 16 17:33:55 compute-0 ovn_controller[96437]: 2026-02-16T17:33:55Z|00106|binding|INFO|Claiming lport 41e1a19e-a3c4-4930-8b5f-d197049955d2 for this chassis.
Feb 16 17:33:55 compute-0 ovn_controller[96437]: 2026-02-16T17:33:55Z|00107|binding|INFO|41e1a19e-a3c4-4930-8b5f-d197049955d2: Claiming fa:16:3e:7c:29:3f 10.100.0.4
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.921 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.927 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.931 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.937 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:55.947 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:29:3f 10.100.0.4'], port_security=['fa:16:3e:7c:29:3f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e4c1ab36-37d0-4a70-b99c-cd2bb7707c39', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=41e1a19e-a3c4-4930-8b5f-d197049955d2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:33:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:55.949 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 41e1a19e-a3c4-4930-8b5f-d197049955d2 in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 bound to our chassis
Feb 16 17:33:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:55.951 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:33:55 compute-0 systemd-machined[155631]: New machine qemu-10-instance-0000000b.
Feb 16 17:33:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:55.963 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e4ab0c-593d-4e00-b6ac-624021bf477a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:55.965 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap94cafcd0-c1 in ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:33:55 compute-0 systemd-udevd[210284]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:33:55 compute-0 ovn_controller[96437]: 2026-02-16T17:33:55Z|00108|binding|INFO|Setting lport 41e1a19e-a3c4-4930-8b5f-d197049955d2 ovn-installed in OVS
Feb 16 17:33:55 compute-0 ovn_controller[96437]: 2026-02-16T17:33:55Z|00109|binding|INFO|Setting lport 41e1a19e-a3c4-4930-8b5f-d197049955d2 up in Southbound
Feb 16 17:33:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:55.967 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap94cafcd0-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:33:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:55.967 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b453b106-6a2e-4425-bc16-c54a4de2b285]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:55.969 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[111661b3-12ee-4827-a56f-2b03f7dfd673]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:55 compute-0 nova_compute[186176]: 2026-02-16 17:33:55.971 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:55 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000b.
Feb 16 17:33:55 compute-0 NetworkManager[56463]: <info>  [1771263235.9887] device (tap41e1a19e-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:33:55 compute-0 NetworkManager[56463]: <info>  [1771263235.9896] device (tap41e1a19e-a3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:33:55 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:55.987 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb8960c-0ce3-48af-968e-d46d423e392b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.008 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[36dd7c00-70cd-4638-98dd-4a3e3123c12b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.049 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b99728-a48d-4f91-b55d-577acf833e40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 NetworkManager[56463]: <info>  [1771263236.0574] manager: (tap94cafcd0-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/50)
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.056 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[8c452c23-b0c1-4036-b64f-ad5c27e8f0dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 systemd-udevd[210287]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.094 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[cf636658-b071-4db7-a6cd-2ee4141c3c45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.099 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[2d693eb7-509c-4c46-9cfa-541933b09569]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 NetworkManager[56463]: <info>  [1771263236.1370] device (tap94cafcd0-c0): carrier: link connected
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.146 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf44e1f-e30a-4512-9dc1-71dc9ec1d469]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.170 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[59d28e40-5b64-49aa-8a21-a01978ce31f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 483166, 'reachable_time': 19315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210316, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.184 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[92a85bf5-c74b-479d-8e62-b9958913fe08]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 483166, 'tstamp': 483166}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210317, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.201 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[2248de43-cd45-4861-b206-8732402b7340]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 483166, 'reachable_time': 19315, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210318, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.231 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[8b63f979-0e8a-442b-97a0-47ef58b73acd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.306 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[ade42f90-9ab1-4e2c-8867-a4da67919fd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.308 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.308 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.309 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94cafcd0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:33:56 compute-0 NetworkManager[56463]: <info>  [1771263236.3123] manager: (tap94cafcd0-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Feb 16 17:33:56 compute-0 kernel: tap94cafcd0-c0: entered promiscuous mode
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.312 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.316 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94cafcd0-c0, col_values=(('external_ids', {'iface-id': '5c28d585-b48c-40c6-b5e7-f1e59317b2de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:33:56 compute-0 ovn_controller[96437]: 2026-02-16T17:33:56Z|00110|binding|INFO|Releasing lport 5c28d585-b48c-40c6-b5e7-f1e59317b2de from this chassis (sb_readonly=0)
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.319 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.319 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.322 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.322 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[95cf6d3a-b90b-4334-a130-6ec09faf70fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.323 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:33:56 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:33:56.325 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'env', 'PROCESS_TAG=haproxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.401 186180 DEBUG nova.compute.manager [req-710e14c3-5fde-422d-92f9-77145ae474ee req-06aa3dc7-5969-4745-bfb2-67be8fa8f3f9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.402 186180 DEBUG oslo_concurrency.lockutils [req-710e14c3-5fde-422d-92f9-77145ae474ee req-06aa3dc7-5969-4745-bfb2-67be8fa8f3f9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.402 186180 DEBUG oslo_concurrency.lockutils [req-710e14c3-5fde-422d-92f9-77145ae474ee req-06aa3dc7-5969-4745-bfb2-67be8fa8f3f9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.403 186180 DEBUG oslo_concurrency.lockutils [req-710e14c3-5fde-422d-92f9-77145ae474ee req-06aa3dc7-5969-4745-bfb2-67be8fa8f3f9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.403 186180 DEBUG nova.compute.manager [req-710e14c3-5fde-422d-92f9-77145ae474ee req-06aa3dc7-5969-4745-bfb2-67be8fa8f3f9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Processing event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:33:56 compute-0 podman[210352]: 2026-02-16 17:33:56.685607414 +0000 UTC m=+0.062368485 container create 4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.700 186180 DEBUG nova.compute.manager [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.703 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263236.6995785, e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.703 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] VM Started (Lifecycle Event)
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.711 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.717 186180 INFO nova.virt.libvirt.driver [-] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Instance spawned successfully.
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.718 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.730 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.733 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:33:56 compute-0 systemd[1]: Started libpod-conmon-4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe.scope.
Feb 16 17:33:56 compute-0 podman[210352]: 2026-02-16 17:33:56.650967265 +0000 UTC m=+0.027728296 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.752 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.752 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.753 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.753 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.754 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.754 186180 DEBUG nova.virt.libvirt.driver [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.763 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.766 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.766 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263236.6998558, e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.767 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] VM Paused (Lifecycle Event)
Feb 16 17:33:56 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:33:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f500baf3dfc955e10578b6e555396516946e49728e79f047618a41a60145da55/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:33:56 compute-0 podman[210352]: 2026-02-16 17:33:56.787465849 +0000 UTC m=+0.164226900 container init 4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:33:56 compute-0 podman[210352]: 2026-02-16 17:33:56.792408813 +0000 UTC m=+0.169169844 container start 4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:33:56 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[210372]: [NOTICE]   (210376) : New worker (210378) forked
Feb 16 17:33:56 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[210372]: [NOTICE]   (210376) : Loading success.
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.818 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.821 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263236.7094526, e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.822 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] VM Resumed (Lifecycle Event)
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.831 186180 INFO nova.compute.manager [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Took 6.78 seconds to spawn the instance on the hypervisor.
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.831 186180 DEBUG nova.compute.manager [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.842 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.845 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.879 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.902 186180 INFO nova.compute.manager [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Took 7.67 seconds to build instance.
Feb 16 17:33:56 compute-0 nova_compute[186176]: 2026-02-16 17:33:56.929 186180 DEBUG oslo_concurrency.lockutils [None req-aede440e-5c77-44a0-86ba-9e7d62c2ed0f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.834s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:33:57 compute-0 nova_compute[186176]: 2026-02-16 17:33:57.180 186180 DEBUG nova.network.neutron [req-62f6dbb3-dbd2-426a-b275-616645321904 req-10cc898a-2f5e-4b95-9b4c-3815591574e1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Updated VIF entry in instance network info cache for port 41e1a19e-a3c4-4930-8b5f-d197049955d2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:33:57 compute-0 nova_compute[186176]: 2026-02-16 17:33:57.181 186180 DEBUG nova.network.neutron [req-62f6dbb3-dbd2-426a-b275-616645321904 req-10cc898a-2f5e-4b95-9b4c-3815591574e1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Updating instance_info_cache with network_info: [{"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:33:57 compute-0 nova_compute[186176]: 2026-02-16 17:33:57.220 186180 DEBUG oslo_concurrency.lockutils [req-62f6dbb3-dbd2-426a-b275-616645321904 req-10cc898a-2f5e-4b95-9b4c-3815591574e1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:33:58 compute-0 nova_compute[186176]: 2026-02-16 17:33:58.778 186180 DEBUG nova.compute.manager [req-07a8282c-d685-453e-a394-2895d19b25a8 req-d62ef938-1f31-459a-8191-3899440a51c6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:33:58 compute-0 nova_compute[186176]: 2026-02-16 17:33:58.779 186180 DEBUG oslo_concurrency.lockutils [req-07a8282c-d685-453e-a394-2895d19b25a8 req-d62ef938-1f31-459a-8191-3899440a51c6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:33:58 compute-0 nova_compute[186176]: 2026-02-16 17:33:58.780 186180 DEBUG oslo_concurrency.lockutils [req-07a8282c-d685-453e-a394-2895d19b25a8 req-d62ef938-1f31-459a-8191-3899440a51c6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:33:58 compute-0 nova_compute[186176]: 2026-02-16 17:33:58.780 186180 DEBUG oslo_concurrency.lockutils [req-07a8282c-d685-453e-a394-2895d19b25a8 req-d62ef938-1f31-459a-8191-3899440a51c6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:33:58 compute-0 nova_compute[186176]: 2026-02-16 17:33:58.781 186180 DEBUG nova.compute.manager [req-07a8282c-d685-453e-a394-2895d19b25a8 req-d62ef938-1f31-459a-8191-3899440a51c6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] No waiting events found dispatching network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:33:58 compute-0 nova_compute[186176]: 2026-02-16 17:33:58.781 186180 WARNING nova.compute.manager [req-07a8282c-d685-453e-a394-2895d19b25a8 req-d62ef938-1f31-459a-8191-3899440a51c6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received unexpected event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 for instance with vm_state active and task_state None.
Feb 16 17:33:59 compute-0 podman[195505]: time="2026-02-16T17:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:33:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:33:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2623 "" "Go-http-client/1.1"
Feb 16 17:34:00 compute-0 nova_compute[186176]: 2026-02-16 17:34:00.034 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:01 compute-0 openstack_network_exporter[198360]: ERROR   17:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:34:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:34:01 compute-0 openstack_network_exporter[198360]: ERROR   17:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:34:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:34:01 compute-0 anacron[39749]: Job `cron.weekly' started
Feb 16 17:34:01 compute-0 anacron[39749]: Job `cron.weekly' terminated
Feb 16 17:34:01 compute-0 nova_compute[186176]: 2026-02-16 17:34:01.754 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:05 compute-0 nova_compute[186176]: 2026-02-16 17:34:05.036 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:06 compute-0 nova_compute[186176]: 2026-02-16 17:34:06.756 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:08 compute-0 ovn_controller[96437]: 2026-02-16T17:34:08Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:29:3f 10.100.0.4
Feb 16 17:34:08 compute-0 ovn_controller[96437]: 2026-02-16T17:34:08Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:29:3f 10.100.0.4
Feb 16 17:34:10 compute-0 nova_compute[186176]: 2026-02-16 17:34:10.038 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:10 compute-0 podman[210398]: 2026-02-16 17:34:10.112909193 +0000 UTC m=+0.080260233 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, version=9.7, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=)
Feb 16 17:34:11 compute-0 nova_compute[186176]: 2026-02-16 17:34:11.720 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:34:11.720 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:34:11 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:34:11.721 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:34:11 compute-0 nova_compute[186176]: 2026-02-16 17:34:11.757 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:12 compute-0 podman[210421]: 2026-02-16 17:34:12.093295554 +0000 UTC m=+0.062126469 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 17:34:15 compute-0 nova_compute[186176]: 2026-02-16 17:34:15.040 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:16 compute-0 nova_compute[186176]: 2026-02-16 17:34:16.760 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:34:18.726 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:34:19 compute-0 podman[210443]: 2026-02-16 17:34:19.090191673 +0000 UTC m=+0.056275463 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:34:19 compute-0 podman[210442]: 2026-02-16 17:34:19.114433941 +0000 UTC m=+0.087171277 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_controller)
Feb 16 17:34:20 compute-0 nova_compute[186176]: 2026-02-16 17:34:20.042 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:21 compute-0 nova_compute[186176]: 2026-02-16 17:34:21.763 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:25 compute-0 nova_compute[186176]: 2026-02-16 17:34:25.045 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:26 compute-0 nova_compute[186176]: 2026-02-16 17:34:26.802 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:29 compute-0 podman[195505]: time="2026-02-16T17:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:34:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:34:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 16 17:34:30 compute-0 nova_compute[186176]: 2026-02-16 17:34:30.049 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:31 compute-0 openstack_network_exporter[198360]: ERROR   17:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:34:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:34:31 compute-0 openstack_network_exporter[198360]: ERROR   17:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:34:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:34:31 compute-0 nova_compute[186176]: 2026-02-16 17:34:31.804 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:35 compute-0 nova_compute[186176]: 2026-02-16 17:34:35.051 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:36 compute-0 nova_compute[186176]: 2026-02-16 17:34:36.806 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:34:38.167 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:34:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:34:38.168 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:34:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:34:38.168 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:34:38 compute-0 nova_compute[186176]: 2026-02-16 17:34:38.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:34:38 compute-0 nova_compute[186176]: 2026-02-16 17:34:38.320 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:34:38 compute-0 nova_compute[186176]: 2026-02-16 17:34:38.320 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:34:38 compute-0 nova_compute[186176]: 2026-02-16 17:34:38.657 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:34:38 compute-0 nova_compute[186176]: 2026-02-16 17:34:38.658 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:34:38 compute-0 nova_compute[186176]: 2026-02-16 17:34:38.658 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:34:38 compute-0 nova_compute[186176]: 2026-02-16 17:34:38.659 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:34:40 compute-0 nova_compute[186176]: 2026-02-16 17:34:40.052 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:41 compute-0 podman[210495]: 2026-02-16 17:34:41.104176202 +0000 UTC m=+0.069265278 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2026-02-05T04:57:10Z)
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.693 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Updating instance_info_cache with network_info: [{"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.721 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.722 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.723 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.723 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.724 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.724 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.755 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.755 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.755 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.755 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:34:41 compute-0 ovn_controller[96437]: 2026-02-16T17:34:41Z|00111|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.809 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.847 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.900 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.901 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:34:41 compute-0 nova_compute[186176]: 2026-02-16 17:34:41.963 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.111 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.112 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5675MB free_disk=73.19454574584961GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.112 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.113 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.202 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.203 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.203 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.218 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing inventories for resource provider bb904aac-529f-46ef-9861-9c655a4b383c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.236 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating ProviderTree inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.236 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.248 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing aggregate associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.271 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing trait associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.313 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.330 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.353 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:34:42 compute-0 nova_compute[186176]: 2026-02-16 17:34:42.354 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:34:43 compute-0 podman[210523]: 2026-02-16 17:34:43.091168487 +0000 UTC m=+0.058201461 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 16 17:34:43 compute-0 nova_compute[186176]: 2026-02-16 17:34:43.947 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:34:43 compute-0 nova_compute[186176]: 2026-02-16 17:34:43.947 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:34:44 compute-0 nova_compute[186176]: 2026-02-16 17:34:44.019 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:34:44 compute-0 nova_compute[186176]: 2026-02-16 17:34:44.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:34:45 compute-0 nova_compute[186176]: 2026-02-16 17:34:45.054 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:46 compute-0 nova_compute[186176]: 2026-02-16 17:34:46.894 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:47 compute-0 nova_compute[186176]: 2026-02-16 17:34:47.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:34:50 compute-0 nova_compute[186176]: 2026-02-16 17:34:50.056 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:50 compute-0 podman[210543]: 2026-02-16 17:34:50.112946649 +0000 UTC m=+0.075192267 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:34:50 compute-0 podman[210542]: 2026-02-16 17:34:50.112411676 +0000 UTC m=+0.085397733 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 16 17:34:50 compute-0 nova_compute[186176]: 2026-02-16 17:34:50.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:34:51 compute-0 nova_compute[186176]: 2026-02-16 17:34:51.932 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:55 compute-0 nova_compute[186176]: 2026-02-16 17:34:55.105 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:56 compute-0 nova_compute[186176]: 2026-02-16 17:34:56.933 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:34:59 compute-0 podman[195505]: time="2026-02-16T17:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:34:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:34:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Feb 16 17:35:00 compute-0 nova_compute[186176]: 2026-02-16 17:35:00.107 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:01 compute-0 openstack_network_exporter[198360]: ERROR   17:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:35:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:35:01 compute-0 openstack_network_exporter[198360]: ERROR   17:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:35:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:35:01 compute-0 nova_compute[186176]: 2026-02-16 17:35:01.963 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:05 compute-0 nova_compute[186176]: 2026-02-16 17:35:05.109 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:06 compute-0 nova_compute[186176]: 2026-02-16 17:35:06.965 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:10 compute-0 nova_compute[186176]: 2026-02-16 17:35:10.112 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:11 compute-0 nova_compute[186176]: 2026-02-16 17:35:11.967 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:12 compute-0 podman[210591]: 2026-02-16 17:35:12.092890453 +0000 UTC m=+0.065267878 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, distribution-scope=public, release=1770267347, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, version=9.7, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Feb 16 17:35:14 compute-0 podman[210613]: 2026-02-16 17:35:14.10939816 +0000 UTC m=+0.079188207 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 16 17:35:15 compute-0 nova_compute[186176]: 2026-02-16 17:35:15.115 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:16 compute-0 nova_compute[186176]: 2026-02-16 17:35:16.971 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:20 compute-0 nova_compute[186176]: 2026-02-16 17:35:20.116 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:21 compute-0 podman[210634]: 2026-02-16 17:35:21.083617541 +0000 UTC m=+0.051495162 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:35:21 compute-0 podman[210633]: 2026-02-16 17:35:21.164875449 +0000 UTC m=+0.132023922 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 16 17:35:21 compute-0 nova_compute[186176]: 2026-02-16 17:35:21.972 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:25 compute-0 nova_compute[186176]: 2026-02-16 17:35:25.118 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:26 compute-0 nova_compute[186176]: 2026-02-16 17:35:26.268 186180 DEBUG nova.compute.manager [None req-9986fb8e-5877-4039-a06f-b1d564a24747 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider bb904aac-529f-46ef-9861-9c655a4b383c in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 16 17:35:26 compute-0 nova_compute[186176]: 2026-02-16 17:35:26.333 186180 DEBUG nova.compute.provider_tree [None req-9986fb8e-5877-4039-a06f-b1d564a24747 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Updating resource provider bb904aac-529f-46ef-9861-9c655a4b383c generation from 15 to 16 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 17:35:26 compute-0 nova_compute[186176]: 2026-02-16 17:35:26.972 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:29 compute-0 podman[195505]: time="2026-02-16T17:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:35:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:35:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 16 17:35:30 compute-0 nova_compute[186176]: 2026-02-16 17:35:30.120 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:31 compute-0 openstack_network_exporter[198360]: ERROR   17:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:35:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:35:31 compute-0 openstack_network_exporter[198360]: ERROR   17:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:35:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:35:31 compute-0 nova_compute[186176]: 2026-02-16 17:35:31.974 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:32 compute-0 nova_compute[186176]: 2026-02-16 17:35:32.712 186180 DEBUG nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Check if temp file /var/lib/nova/instances/tmpxfmmdll2 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 17:35:32 compute-0 nova_compute[186176]: 2026-02-16 17:35:32.713 186180 DEBUG nova.compute.manager [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxfmmdll2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e4c1ab36-37d0-4a70-b99c-cd2bb7707c39',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 17:35:34 compute-0 nova_compute[186176]: 2026-02-16 17:35:34.502 186180 DEBUG oslo_concurrency.processutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:35:34 compute-0 nova_compute[186176]: 2026-02-16 17:35:34.579 186180 DEBUG oslo_concurrency.processutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:35:34 compute-0 nova_compute[186176]: 2026-02-16 17:35:34.580 186180 DEBUG oslo_concurrency.processutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:35:34 compute-0 nova_compute[186176]: 2026-02-16 17:35:34.640 186180 DEBUG oslo_concurrency.processutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:35:35 compute-0 nova_compute[186176]: 2026-02-16 17:35:35.123 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:36 compute-0 nova_compute[186176]: 2026-02-16 17:35:36.977 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:38.168 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:38.168 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:38.169 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:40 compute-0 nova_compute[186176]: 2026-02-16 17:35:40.126 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:40 compute-0 nova_compute[186176]: 2026-02-16 17:35:40.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:35:40 compute-0 nova_compute[186176]: 2026-02-16 17:35:40.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:35:40 compute-0 nova_compute[186176]: 2026-02-16 17:35:40.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:35:40 compute-0 nova_compute[186176]: 2026-02-16 17:35:40.342 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:35:40 compute-0 nova_compute[186176]: 2026-02-16 17:35:40.343 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:35:40 compute-0 nova_compute[186176]: 2026-02-16 17:35:40.343 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:35:40 compute-0 nova_compute[186176]: 2026-02-16 17:35:40.344 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:35:40 compute-0 sshd-session[210687]: Accepted publickey for nova from 192.168.122.101 port 59320 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:35:40 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 17:35:40 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 17:35:40 compute-0 systemd-logind[821]: New session 33 of user nova.
Feb 16 17:35:40 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 17:35:40 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 17:35:40 compute-0 systemd[210691]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:35:40 compute-0 systemd[210691]: Queued start job for default target Main User Target.
Feb 16 17:35:40 compute-0 systemd[210691]: Created slice User Application Slice.
Feb 16 17:35:40 compute-0 systemd[210691]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:35:40 compute-0 systemd[210691]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 17:35:40 compute-0 systemd[210691]: Reached target Paths.
Feb 16 17:35:40 compute-0 systemd[210691]: Reached target Timers.
Feb 16 17:35:40 compute-0 systemd[210691]: Starting D-Bus User Message Bus Socket...
Feb 16 17:35:40 compute-0 systemd[210691]: Starting Create User's Volatile Files and Directories...
Feb 16 17:35:40 compute-0 systemd[210691]: Listening on D-Bus User Message Bus Socket.
Feb 16 17:35:40 compute-0 systemd[210691]: Finished Create User's Volatile Files and Directories.
Feb 16 17:35:40 compute-0 systemd[210691]: Reached target Sockets.
Feb 16 17:35:40 compute-0 systemd[210691]: Reached target Basic System.
Feb 16 17:35:40 compute-0 systemd[210691]: Reached target Main User Target.
Feb 16 17:35:40 compute-0 systemd[210691]: Startup finished in 118ms.
Feb 16 17:35:40 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 17:35:40 compute-0 systemd[1]: Started Session 33 of User nova.
Feb 16 17:35:40 compute-0 sshd-session[210687]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:35:40 compute-0 sshd-session[210707]: Received disconnect from 192.168.122.101 port 59320:11: disconnected by user
Feb 16 17:35:40 compute-0 sshd-session[210707]: Disconnected from user nova 192.168.122.101 port 59320
Feb 16 17:35:40 compute-0 sshd-session[210687]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:35:40 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Feb 16 17:35:40 compute-0 systemd-logind[821]: Session 33 logged out. Waiting for processes to exit.
Feb 16 17:35:40 compute-0 systemd-logind[821]: Removed session 33.
Feb 16 17:35:41 compute-0 nova_compute[186176]: 2026-02-16 17:35:41.640 186180 DEBUG nova.compute.manager [req-c4b6d337-57e2-4955-8870-130d7d3129a5 req-22a45dde-0cd2-48bf-88a6-492a47eea1ea 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-unplugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:35:41 compute-0 nova_compute[186176]: 2026-02-16 17:35:41.643 186180 DEBUG oslo_concurrency.lockutils [req-c4b6d337-57e2-4955-8870-130d7d3129a5 req-22a45dde-0cd2-48bf-88a6-492a47eea1ea 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:41 compute-0 nova_compute[186176]: 2026-02-16 17:35:41.644 186180 DEBUG oslo_concurrency.lockutils [req-c4b6d337-57e2-4955-8870-130d7d3129a5 req-22a45dde-0cd2-48bf-88a6-492a47eea1ea 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:41 compute-0 nova_compute[186176]: 2026-02-16 17:35:41.644 186180 DEBUG oslo_concurrency.lockutils [req-c4b6d337-57e2-4955-8870-130d7d3129a5 req-22a45dde-0cd2-48bf-88a6-492a47eea1ea 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:41 compute-0 nova_compute[186176]: 2026-02-16 17:35:41.645 186180 DEBUG nova.compute.manager [req-c4b6d337-57e2-4955-8870-130d7d3129a5 req-22a45dde-0cd2-48bf-88a6-492a47eea1ea 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] No waiting events found dispatching network-vif-unplugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:35:41 compute-0 nova_compute[186176]: 2026-02-16 17:35:41.645 186180 DEBUG nova.compute.manager [req-c4b6d337-57e2-4955-8870-130d7d3129a5 req-22a45dde-0cd2-48bf-88a6-492a47eea1ea 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-unplugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:35:41 compute-0 nova_compute[186176]: 2026-02-16 17:35:41.799 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:41.799 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:35:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:41.802 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:35:41 compute-0 nova_compute[186176]: 2026-02-16 17:35:41.978 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.281 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Updating instance_info_cache with network_info: [{"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.307 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.307 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.308 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.340 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.341 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.341 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.341 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.606 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.682 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.683 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.734 186180 INFO nova.compute.manager [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Took 8.09 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.735 186180 DEBUG nova.compute.manager [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.739 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.757 186180 DEBUG nova.compute.manager [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxfmmdll2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e4c1ab36-37d0-4a70-b99c-cd2bb7707c39',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(65609735-7a1b-400f-9b0f-ea334c772b0d),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.793 186180 DEBUG nova.objects.instance [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.796 186180 DEBUG nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.799 186180 DEBUG nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.799 186180 DEBUG nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.843 186180 DEBUG nova.virt.libvirt.vif [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:33:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1142233594',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1142233594',id=11,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:33:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-grava8bh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:33:56Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=e4c1ab36-37d0-4a70-b99c-cd2bb7707c39,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.844 186180 DEBUG nova.network.os_vif_util [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.845 186180 DEBUG nova.network.os_vif_util [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:29:3f,bridge_name='br-int',has_traffic_filtering=True,id=41e1a19e-a3c4-4930-8b5f-d197049955d2,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41e1a19e-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.845 186180 DEBUG nova.virt.libvirt.migration [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 17:35:42 compute-0 nova_compute[186176]:   <mac address="fa:16:3e:7c:29:3f"/>
Feb 16 17:35:42 compute-0 nova_compute[186176]:   <model type="virtio"/>
Feb 16 17:35:42 compute-0 nova_compute[186176]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:35:42 compute-0 nova_compute[186176]:   <mtu size="1442"/>
Feb 16 17:35:42 compute-0 nova_compute[186176]:   <target dev="tap41e1a19e-a3"/>
Feb 16 17:35:42 compute-0 nova_compute[186176]: </interface>
Feb 16 17:35:42 compute-0 nova_compute[186176]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.846 186180 DEBUG nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.968 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.970 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5700MB free_disk=73.19449234008789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.970 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:42 compute-0 nova_compute[186176]: 2026-02-16 17:35:42.970 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.042 186180 INFO nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Updating resource usage from migration 65609735-7a1b-400f-9b0f-ea334c772b0d
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.090 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Migration 65609735-7a1b-400f-9b0f-ea334c772b0d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.090 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.090 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:35:43 compute-0 podman[210715]: 2026-02-16 17:35:43.092215194 +0000 UTC m=+0.063574761 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1770267347, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter)
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.158 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.178 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.246 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.246 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.303 186180 DEBUG nova.virt.libvirt.migration [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.304 186180 INFO nova.virt.libvirt.migration [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.367 186180 INFO nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.755 186180 DEBUG nova.compute.manager [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.756 186180 DEBUG oslo_concurrency.lockutils [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.757 186180 DEBUG oslo_concurrency.lockutils [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.757 186180 DEBUG oslo_concurrency.lockutils [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.757 186180 DEBUG nova.compute.manager [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] No waiting events found dispatching network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.758 186180 WARNING nova.compute.manager [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received unexpected event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 for instance with vm_state active and task_state migrating.
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.758 186180 DEBUG nova.compute.manager [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-changed-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.758 186180 DEBUG nova.compute.manager [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Refreshing instance network info cache due to event network-changed-41e1a19e-a3c4-4930-8b5f-d197049955d2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.758 186180 DEBUG oslo_concurrency.lockutils [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.759 186180 DEBUG oslo_concurrency.lockutils [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.759 186180 DEBUG nova.network.neutron [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Refreshing network info cache for port 41e1a19e-a3c4-4930-8b5f-d197049955d2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.869 186180 DEBUG nova.virt.libvirt.migration [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:35:43 compute-0 nova_compute[186176]: 2026-02-16 17:35:43.869 186180 DEBUG nova.virt.libvirt.migration [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.247 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.248 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.373 186180 DEBUG nova.virt.libvirt.migration [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.374 186180 DEBUG nova.virt.libvirt.migration [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.488 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263344.487876, e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.488 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] VM Paused (Lifecycle Event)
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.517 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.522 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.544 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 17:35:44 compute-0 kernel: tap41e1a19e-a3 (unregistering): left promiscuous mode
Feb 16 17:35:44 compute-0 NetworkManager[56463]: <info>  [1771263344.6068] device (tap41e1a19e-a3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:35:44 compute-0 ovn_controller[96437]: 2026-02-16T17:35:44Z|00112|binding|INFO|Releasing lport 41e1a19e-a3c4-4930-8b5f-d197049955d2 from this chassis (sb_readonly=0)
Feb 16 17:35:44 compute-0 ovn_controller[96437]: 2026-02-16T17:35:44Z|00113|binding|INFO|Setting lport 41e1a19e-a3c4-4930-8b5f-d197049955d2 down in Southbound
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.611 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:44 compute-0 ovn_controller[96437]: 2026-02-16T17:35:44Z|00114|binding|INFO|Removing iface tap41e1a19e-a3 ovn-installed in OVS
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.620 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:29:3f 10.100.0.4'], port_security=['fa:16:3e:7c:29:3f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '2e3a84a9-c1b4-4b1e-92e3-57d0875592cc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'e4c1ab36-37d0-4a70-b99c-cd2bb7707c39', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=41e1a19e-a3c4-4930-8b5f-d197049955d2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.622 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 41e1a19e-a3c4-4930-8b5f-d197049955d2 in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 unbound from our chassis
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.623 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.624 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[14a70fab-bab4-47b8-84fe-d893bcba61f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.624 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.625 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace which is not needed anymore
Feb 16 17:35:44 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Feb 16 17:35:44 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000b.scope: Consumed 17.078s CPU time.
Feb 16 17:35:44 compute-0 systemd-machined[155631]: Machine qemu-10-instance-0000000b terminated.
Feb 16 17:35:44 compute-0 podman[210758]: 2026-02-16 17:35:44.742383355 +0000 UTC m=+0.109501412 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 16 17:35:44 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[210372]: [NOTICE]   (210376) : haproxy version is 2.8.14-c23fe91
Feb 16 17:35:44 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[210372]: [NOTICE]   (210376) : path to executable is /usr/sbin/haproxy
Feb 16 17:35:44 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[210372]: [WARNING]  (210376) : Exiting Master process...
Feb 16 17:35:44 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[210372]: [WARNING]  (210376) : Exiting Master process...
Feb 16 17:35:44 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[210372]: [ALERT]    (210376) : Current worker (210378) exited with code 143 (Terminated)
Feb 16 17:35:44 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[210372]: [WARNING]  (210376) : All workers exited. Exiting... (0)
Feb 16 17:35:44 compute-0 systemd[1]: libpod-4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe.scope: Deactivated successfully.
Feb 16 17:35:44 compute-0 podman[210799]: 2026-02-16 17:35:44.783186879 +0000 UTC m=+0.056770982 container died 4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:35:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe-userdata-shm.mount: Deactivated successfully.
Feb 16 17:35:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-f500baf3dfc955e10578b6e555396516946e49728e79f047618a41a60145da55-merged.mount: Deactivated successfully.
Feb 16 17:35:44 compute-0 podman[210799]: 2026-02-16 17:35:44.832033083 +0000 UTC m=+0.105617176 container cleanup 4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.841 186180 DEBUG nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.842 186180 DEBUG nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.842 186180 DEBUG nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 17:35:44 compute-0 systemd[1]: libpod-conmon-4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe.scope: Deactivated successfully.
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.876 186180 DEBUG nova.virt.libvirt.guest [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'e4c1ab36-37d0-4a70-b99c-cd2bb7707c39' (instance-0000000b) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.878 186180 INFO nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Migration operation has completed
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.879 186180 INFO nova.compute.manager [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] _post_live_migration() is started..
Feb 16 17:35:44 compute-0 podman[210845]: 2026-02-16 17:35:44.91437407 +0000 UTC m=+0.053892181 container remove 4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.919 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f0268689-62de-4b56-a6d4-451574a5d9cd]: (4, ('Mon Feb 16 05:35:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe)\n4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe\nMon Feb 16 05:35:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe)\n4db33b758abacf3d0756b939ca544a0a1c7ef64b41722bc93d1863b82f780dfe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.921 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[81b275ec-70cb-4566-8021-decd11b57002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.923 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.925 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:44 compute-0 kernel: tap94cafcd0-c0: left promiscuous mode
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.929 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.935 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.939 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b6717929-cbb1-42c2-bef9-83b40e549479]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.954 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[2561bdf5-ae79-4851-bf2e-07fa94ee9db9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.956 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f367d02c-9299-4af6-b514-b79714c63943]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.971 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[5e231e7c-cbc3-4ced-a296-d790d29e9724]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 483156, 'reachable_time': 42063, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210862, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.974 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:35:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:44.974 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[9bf2aae3-c2e8-4d0e-a127-3913a49d1d63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:35:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d94cafcd0\x2dc7c2\x2d48b4\x2da2dd\x2d21c16ce48dc4.mount: Deactivated successfully.
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.980 186180 DEBUG nova.compute.manager [req-37821341-6290-4635-b0d2-ebada3b92ec5 req-45591d10-4976-4676-9680-5a306ebe3494 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-unplugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.981 186180 DEBUG oslo_concurrency.lockutils [req-37821341-6290-4635-b0d2-ebada3b92ec5 req-45591d10-4976-4676-9680-5a306ebe3494 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.981 186180 DEBUG oslo_concurrency.lockutils [req-37821341-6290-4635-b0d2-ebada3b92ec5 req-45591d10-4976-4676-9680-5a306ebe3494 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.981 186180 DEBUG oslo_concurrency.lockutils [req-37821341-6290-4635-b0d2-ebada3b92ec5 req-45591d10-4976-4676-9680-5a306ebe3494 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.981 186180 DEBUG nova.compute.manager [req-37821341-6290-4635-b0d2-ebada3b92ec5 req-45591d10-4976-4676-9680-5a306ebe3494 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] No waiting events found dispatching network-vif-unplugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:35:44 compute-0 nova_compute[186176]: 2026-02-16 17:35:44.982 186180 DEBUG nova.compute.manager [req-37821341-6290-4635-b0d2-ebada3b92ec5 req-45591d10-4976-4676-9680-5a306ebe3494 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-unplugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:35:45 compute-0 nova_compute[186176]: 2026-02-16 17:35:45.129 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:45 compute-0 nova_compute[186176]: 2026-02-16 17:35:45.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:35:46 compute-0 nova_compute[186176]: 2026-02-16 17:35:46.148 186180 DEBUG nova.compute.manager [req-360e9f7d-1404-48d4-9d20-f77d8500f2a5 req-2595fb0e-850e-49ba-acd1-65c58851c90e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-unplugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:35:46 compute-0 nova_compute[186176]: 2026-02-16 17:35:46.149 186180 DEBUG oslo_concurrency.lockutils [req-360e9f7d-1404-48d4-9d20-f77d8500f2a5 req-2595fb0e-850e-49ba-acd1-65c58851c90e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:46 compute-0 nova_compute[186176]: 2026-02-16 17:35:46.150 186180 DEBUG oslo_concurrency.lockutils [req-360e9f7d-1404-48d4-9d20-f77d8500f2a5 req-2595fb0e-850e-49ba-acd1-65c58851c90e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:46 compute-0 nova_compute[186176]: 2026-02-16 17:35:46.150 186180 DEBUG oslo_concurrency.lockutils [req-360e9f7d-1404-48d4-9d20-f77d8500f2a5 req-2595fb0e-850e-49ba-acd1-65c58851c90e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:46 compute-0 nova_compute[186176]: 2026-02-16 17:35:46.151 186180 DEBUG nova.compute.manager [req-360e9f7d-1404-48d4-9d20-f77d8500f2a5 req-2595fb0e-850e-49ba-acd1-65c58851c90e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] No waiting events found dispatching network-vif-unplugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:35:46 compute-0 nova_compute[186176]: 2026-02-16 17:35:46.151 186180 DEBUG nova.compute.manager [req-360e9f7d-1404-48d4-9d20-f77d8500f2a5 req-2595fb0e-850e-49ba-acd1-65c58851c90e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-unplugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:35:46 compute-0 nova_compute[186176]: 2026-02-16 17:35:46.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.025 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.245 186180 DEBUG nova.compute.manager [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.246 186180 DEBUG oslo_concurrency.lockutils [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.247 186180 DEBUG oslo_concurrency.lockutils [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.247 186180 DEBUG oslo_concurrency.lockutils [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.247 186180 DEBUG nova.compute.manager [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] No waiting events found dispatching network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.248 186180 WARNING nova.compute.manager [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received unexpected event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 for instance with vm_state active and task_state migrating.
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.248 186180 DEBUG nova.compute.manager [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.248 186180 DEBUG oslo_concurrency.lockutils [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.249 186180 DEBUG oslo_concurrency.lockutils [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.249 186180 DEBUG oslo_concurrency.lockutils [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.249 186180 DEBUG nova.compute.manager [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] No waiting events found dispatching network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.250 186180 WARNING nova.compute.manager [req-ff3f378e-cc6a-4da9-a19e-b5ff63927209 req-969d68a8-d3ab-40a1-a220-adf68c3e4db3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received unexpected event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 for instance with vm_state active and task_state migrating.
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.327 186180 DEBUG nova.network.neutron [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Updated VIF entry in instance network info cache for port 41e1a19e-a3c4-4930-8b5f-d197049955d2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.328 186180 DEBUG nova.network.neutron [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Updating instance_info_cache with network_info: [{"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.349 186180 DEBUG oslo_concurrency.lockutils [req-a3868da5-28b5-4306-9540-584225791826 req-f0596392-be91-4b59-94a3-630ca0e0d2ad 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-e4c1ab36-37d0-4a70-b99c-cd2bb7707c39" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.380 186180 DEBUG nova.network.neutron [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Activated binding for port 41e1a19e-a3c4-4930-8b5f-d197049955d2 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.381 186180 DEBUG nova.compute.manager [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.383 186180 DEBUG nova.virt.libvirt.vif [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:33:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1142233594',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1142233594',id=11,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:33:56Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-grava8bh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:35:29Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=e4c1ab36-37d0-4a70-b99c-cd2bb7707c39,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.383 186180 DEBUG nova.network.os_vif_util [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "address": "fa:16:3e:7c:29:3f", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41e1a19e-a3", "ovs_interfaceid": "41e1a19e-a3c4-4930-8b5f-d197049955d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.384 186180 DEBUG nova.network.os_vif_util [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:29:3f,bridge_name='br-int',has_traffic_filtering=True,id=41e1a19e-a3c4-4930-8b5f-d197049955d2,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41e1a19e-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.385 186180 DEBUG os_vif [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:29:3f,bridge_name='br-int',has_traffic_filtering=True,id=41e1a19e-a3c4-4930-8b5f-d197049955d2,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41e1a19e-a3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.389 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.389 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41e1a19e-a3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.392 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.395 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.398 186180 INFO os_vif [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:29:3f,bridge_name='br-int',has_traffic_filtering=True,id=41e1a19e-a3c4-4930-8b5f-d197049955d2,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41e1a19e-a3')
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.399 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.399 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.400 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.400 186180 DEBUG nova.compute.manager [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.401 186180 INFO nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Deleting instance files /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39_del
Feb 16 17:35:47 compute-0 nova_compute[186176]: 2026-02-16 17:35:47.402 186180 INFO nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Deletion of /var/lib/nova/instances/e4c1ab36-37d0-4a70-b99c-cd2bb7707c39_del complete
Feb 16 17:35:48 compute-0 nova_compute[186176]: 2026-02-16 17:35:48.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.339 186180 DEBUG nova.compute.manager [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.340 186180 DEBUG oslo_concurrency.lockutils [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.341 186180 DEBUG oslo_concurrency.lockutils [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.341 186180 DEBUG oslo_concurrency.lockutils [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.341 186180 DEBUG nova.compute.manager [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] No waiting events found dispatching network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.342 186180 WARNING nova.compute.manager [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received unexpected event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 for instance with vm_state active and task_state migrating.
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.342 186180 DEBUG nova.compute.manager [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.343 186180 DEBUG oslo_concurrency.lockutils [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.343 186180 DEBUG oslo_concurrency.lockutils [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.343 186180 DEBUG oslo_concurrency.lockutils [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.344 186180 DEBUG nova.compute.manager [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] No waiting events found dispatching network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:35:49 compute-0 nova_compute[186176]: 2026-02-16 17:35:49.344 186180 WARNING nova.compute.manager [req-e63a8044-2e25-4edc-82c3-be9ab564d12d req-2c123b1f-17b8-4561-8691-4c830f75d093 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Received unexpected event network-vif-plugged-41e1a19e-a3c4-4930-8b5f-d197049955d2 for instance with vm_state active and task_state migrating.
Feb 16 17:35:51 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 17:35:51 compute-0 systemd[210691]: Activating special unit Exit the Session...
Feb 16 17:35:51 compute-0 systemd[210691]: Stopped target Main User Target.
Feb 16 17:35:51 compute-0 systemd[210691]: Stopped target Basic System.
Feb 16 17:35:51 compute-0 systemd[210691]: Stopped target Paths.
Feb 16 17:35:51 compute-0 systemd[210691]: Stopped target Sockets.
Feb 16 17:35:51 compute-0 systemd[210691]: Stopped target Timers.
Feb 16 17:35:51 compute-0 systemd[210691]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:35:51 compute-0 systemd[210691]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 17:35:51 compute-0 systemd[210691]: Closed D-Bus User Message Bus Socket.
Feb 16 17:35:51 compute-0 systemd[210691]: Stopped Create User's Volatile Files and Directories.
Feb 16 17:35:51 compute-0 systemd[210691]: Removed slice User Application Slice.
Feb 16 17:35:51 compute-0 systemd[210691]: Reached target Shutdown.
Feb 16 17:35:51 compute-0 systemd[210691]: Finished Exit the Session.
Feb 16 17:35:51 compute-0 systemd[210691]: Reached target Exit the Session.
Feb 16 17:35:51 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 17:35:51 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 17:35:51 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 17:35:51 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 17:35:51 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 17:35:51 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 17:35:51 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 17:35:51 compute-0 nova_compute[186176]: 2026-02-16 17:35:51.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:35:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:35:51.804 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.079 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:52 compute-0 podman[210865]: 2026-02-16 17:35:52.130893424 +0000 UTC m=+0.052720632 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:35:52 compute-0 podman[210864]: 2026-02-16 17:35:52.181243255 +0000 UTC m=+0.096432388 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.334 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.334 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.335 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e4c1ab36-37d0-4a70-b99c-cd2bb7707c39-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.354 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.354 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.355 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.355 186180 DEBUG nova.compute.resource_tracker [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.392 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.569 186180 WARNING nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.571 186180 DEBUG nova.compute.resource_tracker [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5825MB free_disk=73.22386932373047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.571 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.572 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.629 186180 DEBUG nova.compute.resource_tracker [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Migration for instance e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.650 186180 DEBUG nova.compute.resource_tracker [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.679 186180 DEBUG nova.compute.resource_tracker [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Migration 65609735-7a1b-400f-9b0f-ea334c772b0d is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.679 186180 DEBUG nova.compute.resource_tracker [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.680 186180 DEBUG nova.compute.resource_tracker [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.725 186180 DEBUG nova.compute.provider_tree [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.742 186180 DEBUG nova.scheduler.client.report [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.769 186180 DEBUG nova.compute.resource_tracker [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.770 186180 DEBUG oslo_concurrency.lockutils [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.781 186180 INFO nova.compute.manager [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.877 186180 INFO nova.scheduler.client.report [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Deleted allocation for migration 65609735-7a1b-400f-9b0f-ea334c772b0d
Feb 16 17:35:52 compute-0 nova_compute[186176]: 2026-02-16 17:35:52.878 186180 DEBUG nova.virt.libvirt.driver [None req-11050660-64fe-482f-8bb4-69f041318c5b b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 17:35:57 compute-0 nova_compute[186176]: 2026-02-16 17:35:57.130 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:57 compute-0 nova_compute[186176]: 2026-02-16 17:35:57.394 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:35:59 compute-0 podman[195505]: time="2026-02-16T17:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:35:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:35:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 17:35:59 compute-0 nova_compute[186176]: 2026-02-16 17:35:59.840 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771263344.8382971, e4c1ab36-37d0-4a70-b99c-cd2bb7707c39 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:35:59 compute-0 nova_compute[186176]: 2026-02-16 17:35:59.841 186180 INFO nova.compute.manager [-] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] VM Stopped (Lifecycle Event)
Feb 16 17:35:59 compute-0 nova_compute[186176]: 2026-02-16 17:35:59.867 186180 DEBUG nova.compute.manager [None req-d425d9bf-8276-4056-9a3a-7a97e2889f3e - - - - - -] [instance: e4c1ab36-37d0-4a70-b99c-cd2bb7707c39] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:36:01 compute-0 openstack_network_exporter[198360]: ERROR   17:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:36:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:36:01 compute-0 openstack_network_exporter[198360]: ERROR   17:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:36:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:36:02 compute-0 nova_compute[186176]: 2026-02-16 17:36:02.132 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:02 compute-0 nova_compute[186176]: 2026-02-16 17:36:02.396 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:07 compute-0 nova_compute[186176]: 2026-02-16 17:36:07.398 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:36:07 compute-0 nova_compute[186176]: 2026-02-16 17:36:07.400 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:36:07 compute-0 nova_compute[186176]: 2026-02-16 17:36:07.400 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:36:07 compute-0 nova_compute[186176]: 2026-02-16 17:36:07.400 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:36:08 compute-0 nova_compute[186176]: 2026-02-16 17:36:08.018 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:08 compute-0 nova_compute[186176]: 2026-02-16 17:36:08.019 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:36:08 compute-0 nova_compute[186176]: 2026-02-16 17:36:08.020 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:13 compute-0 nova_compute[186176]: 2026-02-16 17:36:13.021 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:13 compute-0 nova_compute[186176]: 2026-02-16 17:36:13.028 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:36:14 compute-0 podman[210913]: 2026-02-16 17:36:14.101914626 +0000 UTC m=+0.072011471 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, managed_by=edpm_ansible, version=9.7, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 17:36:15 compute-0 podman[210934]: 2026-02-16 17:36:15.080844865 +0000 UTC m=+0.055015808 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 17:36:16 compute-0 nova_compute[186176]: 2026-02-16 17:36:16.900 186180 DEBUG nova.compute.manager [None req-0b10e80c-c09a-4eb1-933c-97d76bcc759b 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider bb904aac-529f-46ef-9861-9c655a4b383c in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 16 17:36:16 compute-0 nova_compute[186176]: 2026-02-16 17:36:16.958 186180 DEBUG nova.compute.provider_tree [None req-0b10e80c-c09a-4eb1-933c-97d76bcc759b 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Updating resource provider bb904aac-529f-46ef-9861-9c655a4b383c generation from 16 to 19 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 17:36:18 compute-0 nova_compute[186176]: 2026-02-16 17:36:18.025 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:18 compute-0 nova_compute[186176]: 2026-02-16 17:36:18.028 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:23 compute-0 nova_compute[186176]: 2026-02-16 17:36:23.027 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:23 compute-0 nova_compute[186176]: 2026-02-16 17:36:23.029 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:23 compute-0 podman[210956]: 2026-02-16 17:36:23.086394649 +0000 UTC m=+0.051624984 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 17:36:23 compute-0 podman[210955]: 2026-02-16 17:36:23.118140668 +0000 UTC m=+0.084852990 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:36:28 compute-0 nova_compute[186176]: 2026-02-16 17:36:28.028 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:28 compute-0 nova_compute[186176]: 2026-02-16 17:36:28.031 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:29 compute-0 ovn_controller[96437]: 2026-02-16T17:36:29Z|00115|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Feb 16 17:36:29 compute-0 podman[195505]: time="2026-02-16T17:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:36:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:36:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.065 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.066 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.083 186180 DEBUG nova.compute.manager [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.178 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.179 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.185 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.186 186180 INFO nova.compute.claims [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.276 186180 DEBUG nova.compute.provider_tree [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.289 186180 DEBUG nova.scheduler.client.report [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.313 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.314 186180 DEBUG nova.compute.manager [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.358 186180 DEBUG nova.compute.manager [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.359 186180 DEBUG nova.network.neutron [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.376 186180 INFO nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.393 186180 DEBUG nova.compute.manager [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:36:31 compute-0 openstack_network_exporter[198360]: ERROR   17:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:36:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:36:31 compute-0 openstack_network_exporter[198360]: ERROR   17:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:36:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.493 186180 DEBUG nova.compute.manager [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.494 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.494 186180 INFO nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Creating image(s)
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.495 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "/var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.495 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.496 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.510 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.552 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.553 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.554 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.564 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.604 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.605 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.634 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.635 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.635 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.681 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.682 186180 DEBUG nova.virt.disk.api [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Checking if we can resize image /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.682 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.728 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.730 186180 DEBUG nova.virt.disk.api [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Cannot resize image /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.730 186180 DEBUG nova.objects.instance [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'migration_context' on Instance uuid 1fa1c686-a82d-4522-8330-1c9cdf431cc1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.745 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.746 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Ensure instance console log exists: /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.746 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.747 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.748 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:31 compute-0 nova_compute[186176]: 2026-02-16 17:36:31.780 186180 DEBUG nova.policy [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54934f49b2044289bcf127662fe114b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:36:32 compute-0 nova_compute[186176]: 2026-02-16 17:36:32.368 186180 DEBUG nova.network.neutron [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Successfully created port: a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:36:33 compute-0 nova_compute[186176]: 2026-02-16 17:36:33.030 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:33 compute-0 nova_compute[186176]: 2026-02-16 17:36:33.783 186180 DEBUG nova.network.neutron [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Successfully updated port: a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:36:33 compute-0 nova_compute[186176]: 2026-02-16 17:36:33.819 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:36:33 compute-0 nova_compute[186176]: 2026-02-16 17:36:33.820 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquired lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:36:33 compute-0 nova_compute[186176]: 2026-02-16 17:36:33.820 186180 DEBUG nova.network.neutron [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:36:33 compute-0 nova_compute[186176]: 2026-02-16 17:36:33.952 186180 DEBUG nova.compute.manager [req-31a12449-a7c8-416b-b35c-beed76f995e3 req-baa5b0f0-8df9-469b-83d5-eb9facba3040 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-changed-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:36:33 compute-0 nova_compute[186176]: 2026-02-16 17:36:33.953 186180 DEBUG nova.compute.manager [req-31a12449-a7c8-416b-b35c-beed76f995e3 req-baa5b0f0-8df9-469b-83d5-eb9facba3040 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Refreshing instance network info cache due to event network-changed-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:36:33 compute-0 nova_compute[186176]: 2026-02-16 17:36:33.953 186180 DEBUG oslo_concurrency.lockutils [req-31a12449-a7c8-416b-b35c-beed76f995e3 req-baa5b0f0-8df9-469b-83d5-eb9facba3040 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.060 186180 DEBUG nova.network.neutron [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.768 186180 DEBUG nova.network.neutron [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Updating instance_info_cache with network_info: [{"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.790 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Releasing lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.791 186180 DEBUG nova.compute.manager [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Instance network_info: |[{"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.792 186180 DEBUG oslo_concurrency.lockutils [req-31a12449-a7c8-416b-b35c-beed76f995e3 req-baa5b0f0-8df9-469b-83d5-eb9facba3040 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.792 186180 DEBUG nova.network.neutron [req-31a12449-a7c8-416b-b35c-beed76f995e3 req-baa5b0f0-8df9-469b-83d5-eb9facba3040 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Refreshing network info cache for port a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.797 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Start _get_guest_xml network_info=[{"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.805 186180 WARNING nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.810 186180 DEBUG nova.virt.libvirt.host [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.811 186180 DEBUG nova.virt.libvirt.host [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.818 186180 DEBUG nova.virt.libvirt.host [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.819 186180 DEBUG nova.virt.libvirt.host [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.820 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.821 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.821 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.821 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.822 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.822 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.822 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.822 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.823 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.823 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.823 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.823 186180 DEBUG nova.virt.hardware [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.827 186180 DEBUG nova.virt.libvirt.vif [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:36:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-866468319',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-866468319',id=13,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-wdp6ej9p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=TagList,task_s
tate='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:36:31Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=1fa1c686-a82d-4522-8330-1c9cdf431cc1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.828 186180 DEBUG nova.network.os_vif_util [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.829 186180 DEBUG nova.network.os_vif_util [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:b9:77,bridge_name='br-int',has_traffic_filtering=True,id=a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa39e1965-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.829 186180 DEBUG nova.objects.instance [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'pci_devices' on Instance uuid 1fa1c686-a82d-4522-8330-1c9cdf431cc1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.849 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:36:34 compute-0 nova_compute[186176]:   <uuid>1fa1c686-a82d-4522-8330-1c9cdf431cc1</uuid>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   <name>instance-0000000d</name>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteStrategies-server-866468319</nova:name>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:36:34</nova:creationTime>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:36:34 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:36:34 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:36:34 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:36:34 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:36:34 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:36:34 compute-0 nova_compute[186176]:         <nova:user uuid="c54934f49b2044289bcf127662fe114b">tempest-TestExecuteStrategies-1098930400-project-member</nova:user>
Feb 16 17:36:34 compute-0 nova_compute[186176]:         <nova:project uuid="1a237c4b00c5426cb1dc6afe3c7c868c">tempest-TestExecuteStrategies-1098930400</nova:project>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:36:34 compute-0 nova_compute[186176]:         <nova:port uuid="a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c">
Feb 16 17:36:34 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <system>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <entry name="serial">1fa1c686-a82d-4522-8330-1c9cdf431cc1</entry>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <entry name="uuid">1fa1c686-a82d-4522-8330-1c9cdf431cc1</entry>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     </system>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   <os>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   </os>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   <features>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   </features>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk.config"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:86:b9:77"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <target dev="tapa39e1965-a5"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/console.log" append="off"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <video>
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     </video>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:36:34 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:36:34 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:36:34 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:36:34 compute-0 nova_compute[186176]: </domain>
Feb 16 17:36:34 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.851 186180 DEBUG nova.compute.manager [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Preparing to wait for external event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.851 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.851 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.851 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.852 186180 DEBUG nova.virt.libvirt.vif [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:36:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-866468319',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-866468319',id=13,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-wdp6ej9p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=TagL
ist,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:36:31Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=1fa1c686-a82d-4522-8330-1c9cdf431cc1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.853 186180 DEBUG nova.network.os_vif_util [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.853 186180 DEBUG nova.network.os_vif_util [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:86:b9:77,bridge_name='br-int',has_traffic_filtering=True,id=a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa39e1965-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.854 186180 DEBUG os_vif [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:b9:77,bridge_name='br-int',has_traffic_filtering=True,id=a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa39e1965-a5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.854 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.855 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.855 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.858 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.859 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa39e1965-a5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.859 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa39e1965-a5, col_values=(('external_ids', {'iface-id': 'a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:86:b9:77', 'vm-uuid': '1fa1c686-a82d-4522-8330-1c9cdf431cc1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.861 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:34 compute-0 NetworkManager[56463]: <info>  [1771263394.8624] manager: (tapa39e1965-a5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.863 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.867 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.867 186180 INFO os_vif [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:86:b9:77,bridge_name='br-int',has_traffic_filtering=True,id=a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa39e1965-a5')
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.925 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.926 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.926 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No VIF found with MAC fa:16:3e:86:b9:77, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:36:34 compute-0 nova_compute[186176]: 2026-02-16 17:36:34.926 186180 INFO nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Using config drive
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.182 186180 INFO nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Creating config drive at /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk.config
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.187 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpkw4_kfi_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.310 186180 DEBUG oslo_concurrency.processutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpkw4_kfi_" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:36:35 compute-0 kernel: tapa39e1965-a5: entered promiscuous mode
Feb 16 17:36:35 compute-0 NetworkManager[56463]: <info>  [1771263395.3772] manager: (tapa39e1965-a5): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Feb 16 17:36:35 compute-0 ovn_controller[96437]: 2026-02-16T17:36:35Z|00116|binding|INFO|Claiming lport a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c for this chassis.
Feb 16 17:36:35 compute-0 ovn_controller[96437]: 2026-02-16T17:36:35Z|00117|binding|INFO|a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c: Claiming fa:16:3e:86:b9:77 10.100.0.11
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.378 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:35 compute-0 ovn_controller[96437]: 2026-02-16T17:36:35Z|00118|binding|INFO|Setting lport a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c ovn-installed in OVS
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.385 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.386 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:35 compute-0 ovn_controller[96437]: 2026-02-16T17:36:35Z|00119|binding|INFO|Setting lport a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c up in Southbound
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.389 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:b9:77 10.100.0.11'], port_security=['fa:16:3e:86:b9:77 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fa1c686-a82d-4522-8330-1c9cdf431cc1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.390 105730 INFO neutron.agent.ovn.metadata.agent [-] Port a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 bound to our chassis
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.392 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.392 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.403 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[10ba47de-ef0f-43c0-89c2-51dd274e5d6e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.404 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap94cafcd0-c1 in ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.406 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap94cafcd0-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.407 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[756b1397-5b94-4a14-ba80-b15d8357f756]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.408 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e80be92b-7e79-4816-b625-588c6f92fa21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 systemd-machined[155631]: New machine qemu-11-instance-0000000d.
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.422 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[9c3dcb8d-48d2-40b9-92f0-a3229d065021]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000d.
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.433 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[8563079a-1c60-4d01-862a-5699ccabd13a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 systemd-udevd[211040]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:36:35 compute-0 NetworkManager[56463]: <info>  [1771263395.4505] device (tapa39e1965-a5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:36:35 compute-0 NetworkManager[56463]: <info>  [1771263395.4513] device (tapa39e1965-a5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.464 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[75baa872-d23d-44cd-be68-0a3dd97bd8a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 systemd-udevd[211044]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:36:35 compute-0 NetworkManager[56463]: <info>  [1771263395.4697] manager: (tap94cafcd0-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.469 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2ca7d9-82a0-4dac-a468-d4cd50aa0211]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.503 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[3f385199-8db0-4df8-bfce-e007fdb86a5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.507 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d716f5-42c9-41a0-8786-ef60510ac7c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 NetworkManager[56463]: <info>  [1771263395.5331] device (tap94cafcd0-c0): carrier: link connected
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.539 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[e741a5aa-30a3-4942-bc4f-88cd124bb426]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.556 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f8153719-533d-4236-a9f2-d996c6e55662]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499105, 'reachable_time': 34798, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211070, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.573 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[563a8eac-b86b-4b18-8793-712bf73b14da]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 499105, 'tstamp': 499105}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211071, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.585 186180 DEBUG nova.compute.manager [req-bd4e90d3-b6e7-456d-b110-3175f1db389a req-b46f2190-eb6f-42c8-93f5-11ef7ff6d3a7 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.585 186180 DEBUG oslo_concurrency.lockutils [req-bd4e90d3-b6e7-456d-b110-3175f1db389a req-b46f2190-eb6f-42c8-93f5-11ef7ff6d3a7 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.586 186180 DEBUG oslo_concurrency.lockutils [req-bd4e90d3-b6e7-456d-b110-3175f1db389a req-b46f2190-eb6f-42c8-93f5-11ef7ff6d3a7 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.586 186180 DEBUG oslo_concurrency.lockutils [req-bd4e90d3-b6e7-456d-b110-3175f1db389a req-b46f2190-eb6f-42c8-93f5-11ef7ff6d3a7 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.586 186180 DEBUG nova.compute.manager [req-bd4e90d3-b6e7-456d-b110-3175f1db389a req-b46f2190-eb6f-42c8-93f5-11ef7ff6d3a7 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Processing event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.590 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[d509166a-8527-4e95-ba5a-d5c1c43c87e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499105, 'reachable_time': 34798, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211072, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.628 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab53a76-da72-466b-b479-a38649ab6d3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.692 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[abb488f8-7dcc-42df-b0be-b90220593b45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.695 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.695 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.696 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94cafcd0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:36:35 compute-0 NetworkManager[56463]: <info>  [1771263395.6992] manager: (tap94cafcd0-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.698 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:35 compute-0 kernel: tap94cafcd0-c0: entered promiscuous mode
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.702 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.703 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94cafcd0-c0, col_values=(('external_ids', {'iface-id': '5c28d585-b48c-40c6-b5e7-f1e59317b2de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.705 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:35 compute-0 ovn_controller[96437]: 2026-02-16T17:36:35Z|00120|binding|INFO|Releasing lport 5c28d585-b48c-40c6-b5e7-f1e59317b2de from this chassis (sb_readonly=0)
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.707 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.708 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[5f5f9d84-1404-49a3-adfa-eb6ec6174744]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.709 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:36:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:35.710 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'env', 'PROCESS_TAG=haproxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.712 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.746 186180 DEBUG nova.compute.manager [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.748 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263395.7475102, 1fa1c686-a82d-4522-8330-1c9cdf431cc1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.748 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] VM Started (Lifecycle Event)
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.753 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.756 186180 INFO nova.virt.libvirt.driver [-] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Instance spawned successfully.
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.757 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.768 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.773 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.777 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.777 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.778 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.778 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.778 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.779 186180 DEBUG nova.virt.libvirt.driver [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.801 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.803 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263395.7483, 1fa1c686-a82d-4522-8330-1c9cdf431cc1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.803 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] VM Paused (Lifecycle Event)
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.833 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.837 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263395.751938, 1fa1c686-a82d-4522-8330-1c9cdf431cc1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.837 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] VM Resumed (Lifecycle Event)
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.844 186180 INFO nova.compute.manager [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Took 4.35 seconds to spawn the instance on the hypervisor.
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.844 186180 DEBUG nova.compute.manager [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.860 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.863 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.898 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.907 186180 INFO nova.compute.manager [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Took 4.75 seconds to build instance.
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.920 186180 DEBUG nova.network.neutron [req-31a12449-a7c8-416b-b35c-beed76f995e3 req-baa5b0f0-8df9-469b-83d5-eb9facba3040 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Updated VIF entry in instance network info cache for port a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.921 186180 DEBUG nova.network.neutron [req-31a12449-a7c8-416b-b35c-beed76f995e3 req-baa5b0f0-8df9-469b-83d5-eb9facba3040 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Updating instance_info_cache with network_info: [{"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.923 186180 DEBUG oslo_concurrency.lockutils [None req-78d69c60-93a1-4063-adc7-56c029bc7c79 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:35 compute-0 nova_compute[186176]: 2026-02-16 17:36:35.938 186180 DEBUG oslo_concurrency.lockutils [req-31a12449-a7c8-416b-b35c-beed76f995e3 req-baa5b0f0-8df9-469b-83d5-eb9facba3040 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:36:36 compute-0 podman[211109]: 2026-02-16 17:36:36.116662784 +0000 UTC m=+0.065463389 container create 99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 17:36:36 compute-0 systemd[1]: Started libpod-conmon-99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e.scope.
Feb 16 17:36:36 compute-0 podman[211109]: 2026-02-16 17:36:36.084397452 +0000 UTC m=+0.033198087 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:36:36 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:36:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1da3c145a34468f23a1e13f43f9809fa13198a7fe827eb4ea4cd6a9d49229b6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:36:36 compute-0 podman[211109]: 2026-02-16 17:36:36.212379652 +0000 UTC m=+0.161180227 container init 99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 17:36:36 compute-0 podman[211109]: 2026-02-16 17:36:36.222893834 +0000 UTC m=+0.171694429 container start 99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 16 17:36:36 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211124]: [NOTICE]   (211128) : New worker (211130) forked
Feb 16 17:36:36 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211124]: [NOTICE]   (211128) : Loading success.
Feb 16 17:36:36 compute-0 nova_compute[186176]: 2026-02-16 17:36:36.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:36 compute-0 nova_compute[186176]: 2026-02-16 17:36:36.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 17:36:36 compute-0 nova_compute[186176]: 2026-02-16 17:36:36.335 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 17:36:37 compute-0 nova_compute[186176]: 2026-02-16 17:36:37.653 186180 DEBUG nova.compute.manager [req-f4ae1d08-287b-408a-a9f2-242ea68125e9 req-e28b87d8-fc95-4cab-9525-91ac8393e90f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:36:37 compute-0 nova_compute[186176]: 2026-02-16 17:36:37.655 186180 DEBUG oslo_concurrency.lockutils [req-f4ae1d08-287b-408a-a9f2-242ea68125e9 req-e28b87d8-fc95-4cab-9525-91ac8393e90f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:37 compute-0 nova_compute[186176]: 2026-02-16 17:36:37.655 186180 DEBUG oslo_concurrency.lockutils [req-f4ae1d08-287b-408a-a9f2-242ea68125e9 req-e28b87d8-fc95-4cab-9525-91ac8393e90f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:37 compute-0 nova_compute[186176]: 2026-02-16 17:36:37.656 186180 DEBUG oslo_concurrency.lockutils [req-f4ae1d08-287b-408a-a9f2-242ea68125e9 req-e28b87d8-fc95-4cab-9525-91ac8393e90f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:37 compute-0 nova_compute[186176]: 2026-02-16 17:36:37.656 186180 DEBUG nova.compute.manager [req-f4ae1d08-287b-408a-a9f2-242ea68125e9 req-e28b87d8-fc95-4cab-9525-91ac8393e90f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] No waiting events found dispatching network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:36:37 compute-0 nova_compute[186176]: 2026-02-16 17:36:37.657 186180 WARNING nova.compute.manager [req-f4ae1d08-287b-408a-a9f2-242ea68125e9 req-e28b87d8-fc95-4cab-9525-91ac8393e90f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received unexpected event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c for instance with vm_state active and task_state None.
Feb 16 17:36:38 compute-0 nova_compute[186176]: 2026-02-16 17:36:38.033 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:38.169 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:38.170 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:38.170 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:39 compute-0 nova_compute[186176]: 2026-02-16 17:36:39.864 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:41 compute-0 nova_compute[186176]: 2026-02-16 17:36:41.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:41 compute-0 nova_compute[186176]: 2026-02-16 17:36:41.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:36:41 compute-0 nova_compute[186176]: 2026-02-16 17:36:41.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:36:41 compute-0 nova_compute[186176]: 2026-02-16 17:36:41.773 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:36:41 compute-0 nova_compute[186176]: 2026-02-16 17:36:41.774 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:36:41 compute-0 nova_compute[186176]: 2026-02-16 17:36:41.774 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:36:41 compute-0 nova_compute[186176]: 2026-02-16 17:36:41.775 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1fa1c686-a82d-4522-8330-1c9cdf431cc1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:36:43 compute-0 nova_compute[186176]: 2026-02-16 17:36:43.081 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:43 compute-0 nova_compute[186176]: 2026-02-16 17:36:43.428 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Updating instance_info_cache with network_info: [{"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:36:43 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:43.453 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:36:43 compute-0 nova_compute[186176]: 2026-02-16 17:36:43.454 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:36:43 compute-0 nova_compute[186176]: 2026-02-16 17:36:43.455 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:36:43 compute-0 nova_compute[186176]: 2026-02-16 17:36:43.456 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:43 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:43.456 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:36:43 compute-0 nova_compute[186176]: 2026-02-16 17:36:43.458 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:43 compute-0 nova_compute[186176]: 2026-02-16 17:36:43.459 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.331 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.352 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.375 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.375 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.376 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.376 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.462 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:36:44 compute-0 podman[211140]: 2026-02-16 17:36:44.493152116 +0000 UTC m=+0.076428600 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, release=1770267347, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.518 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.520 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.592 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.739 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.740 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5647MB free_disk=73.22300720214844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.740 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.740 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.810 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance 1fa1c686-a82d-4522-8330-1c9cdf431cc1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.811 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.811 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.866 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.964 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:36:44 compute-0 nova_compute[186176]: 2026-02-16 17:36:44.979 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:36:45 compute-0 nova_compute[186176]: 2026-02-16 17:36:45.005 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:36:45 compute-0 nova_compute[186176]: 2026-02-16 17:36:45.006 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:45 compute-0 nova_compute[186176]: 2026-02-16 17:36:45.971 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:45 compute-0 nova_compute[186176]: 2026-02-16 17:36:45.972 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:45 compute-0 nova_compute[186176]: 2026-02-16 17:36:45.973 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:36:46 compute-0 podman[211167]: 2026-02-16 17:36:46.116092802 +0000 UTC m=+0.079413275 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:36:46 compute-0 ovn_controller[96437]: 2026-02-16T17:36:46Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:86:b9:77 10.100.0.11
Feb 16 17:36:46 compute-0 ovn_controller[96437]: 2026-02-16T17:36:46Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:86:b9:77 10.100.0.11
Feb 16 17:36:47 compute-0 nova_compute[186176]: 2026-02-16 17:36:47.313 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:47 compute-0 nova_compute[186176]: 2026-02-16 17:36:47.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:47 compute-0 nova_compute[186176]: 2026-02-16 17:36:47.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:47 compute-0 nova_compute[186176]: 2026-02-16 17:36:47.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 17:36:48 compute-0 nova_compute[186176]: 2026-02-16 17:36:48.084 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:49 compute-0 nova_compute[186176]: 2026-02-16 17:36:49.332 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:49 compute-0 nova_compute[186176]: 2026-02-16 17:36:49.872 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:50 compute-0 nova_compute[186176]: 2026-02-16 17:36:50.176 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:50 compute-0 nova_compute[186176]: 2026-02-16 17:36:50.193 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Triggering sync for uuid 1fa1c686-a82d-4522-8330-1c9cdf431cc1 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 16 17:36:50 compute-0 nova_compute[186176]: 2026-02-16 17:36:50.194 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:36:50 compute-0 nova_compute[186176]: 2026-02-16 17:36:50.195 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:36:50 compute-0 nova_compute[186176]: 2026-02-16 17:36:50.230 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:36:53 compute-0 nova_compute[186176]: 2026-02-16 17:36:53.088 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:53 compute-0 nova_compute[186176]: 2026-02-16 17:36:53.335 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:36:53 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:36:53.459 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:36:54 compute-0 podman[211204]: 2026-02-16 17:36:54.113175536 +0000 UTC m=+0.070864712 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:36:54 compute-0 podman[211203]: 2026-02-16 17:36:54.150438923 +0000 UTC m=+0.110757394 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:36:54 compute-0 nova_compute[186176]: 2026-02-16 17:36:54.874 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:58 compute-0 nova_compute[186176]: 2026-02-16 17:36:58.091 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:36:59 compute-0 podman[195505]: time="2026-02-16T17:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:36:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:36:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Feb 16 17:36:59 compute-0 nova_compute[186176]: 2026-02-16 17:36:59.876 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:01 compute-0 openstack_network_exporter[198360]: ERROR   17:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:37:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:37:01 compute-0 openstack_network_exporter[198360]: ERROR   17:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:37:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:37:03 compute-0 nova_compute[186176]: 2026-02-16 17:37:03.149 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:04 compute-0 nova_compute[186176]: 2026-02-16 17:37:04.879 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:08 compute-0 nova_compute[186176]: 2026-02-16 17:37:08.152 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:09 compute-0 nova_compute[186176]: 2026-02-16 17:37:09.883 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:10 compute-0 nova_compute[186176]: 2026-02-16 17:37:10.707 186180 DEBUG nova.compute.manager [None req-508bce57-dea4-4998-bfbe-039bd4898167 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider bb904aac-529f-46ef-9861-9c655a4b383c in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 16 17:37:10 compute-0 nova_compute[186176]: 2026-02-16 17:37:10.754 186180 DEBUG nova.compute.provider_tree [None req-508bce57-dea4-4998-bfbe-039bd4898167 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Updating resource provider bb904aac-529f-46ef-9861-9c655a4b383c generation from 19 to 21 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 17:37:13 compute-0 nova_compute[186176]: 2026-02-16 17:37:13.155 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:13 compute-0 ovn_controller[96437]: 2026-02-16T17:37:13Z|00121|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Feb 16 17:37:14 compute-0 nova_compute[186176]: 2026-02-16 17:37:14.886 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:15 compute-0 podman[211251]: 2026-02-16 17:37:15.106151322 +0000 UTC m=+0.074628326 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, release=1770267347, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter)
Feb 16 17:37:15 compute-0 nova_compute[186176]: 2026-02-16 17:37:15.264 186180 DEBUG nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Check if temp file /var/lib/nova/instances/tmpecenznre exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 17:37:15 compute-0 nova_compute[186176]: 2026-02-16 17:37:15.265 186180 DEBUG nova.compute.manager [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpecenznre',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1fa1c686-a82d-4522-8330-1c9cdf431cc1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 17:37:15 compute-0 nova_compute[186176]: 2026-02-16 17:37:15.809 186180 DEBUG oslo_concurrency.processutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:37:15 compute-0 nova_compute[186176]: 2026-02-16 17:37:15.868 186180 DEBUG oslo_concurrency.processutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:37:15 compute-0 nova_compute[186176]: 2026-02-16 17:37:15.869 186180 DEBUG oslo_concurrency.processutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:37:15 compute-0 nova_compute[186176]: 2026-02-16 17:37:15.944 186180 DEBUG oslo_concurrency.processutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:37:17 compute-0 podman[211280]: 2026-02-16 17:37:17.088267324 +0000 UTC m=+0.056021853 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Feb 16 17:37:18 compute-0 nova_compute[186176]: 2026-02-16 17:37:18.196 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:18 compute-0 sshd-session[211299]: Accepted publickey for nova from 192.168.122.101 port 57332 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:37:18 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 17:37:18 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 17:37:18 compute-0 systemd-logind[821]: New session 35 of user nova.
Feb 16 17:37:18 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 17:37:18 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 17:37:18 compute-0 systemd[211303]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:37:19 compute-0 systemd[211303]: Queued start job for default target Main User Target.
Feb 16 17:37:19 compute-0 systemd[211303]: Created slice User Application Slice.
Feb 16 17:37:19 compute-0 systemd[211303]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:37:19 compute-0 systemd[211303]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 17:37:19 compute-0 systemd[211303]: Reached target Paths.
Feb 16 17:37:19 compute-0 systemd[211303]: Reached target Timers.
Feb 16 17:37:19 compute-0 systemd[211303]: Starting D-Bus User Message Bus Socket...
Feb 16 17:37:19 compute-0 systemd[211303]: Starting Create User's Volatile Files and Directories...
Feb 16 17:37:19 compute-0 systemd[211303]: Listening on D-Bus User Message Bus Socket.
Feb 16 17:37:19 compute-0 systemd[211303]: Reached target Sockets.
Feb 16 17:37:19 compute-0 systemd[211303]: Finished Create User's Volatile Files and Directories.
Feb 16 17:37:19 compute-0 systemd[211303]: Reached target Basic System.
Feb 16 17:37:19 compute-0 systemd[211303]: Reached target Main User Target.
Feb 16 17:37:19 compute-0 systemd[211303]: Startup finished in 112ms.
Feb 16 17:37:19 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 17:37:19 compute-0 systemd[1]: Started Session 35 of User nova.
Feb 16 17:37:19 compute-0 sshd-session[211299]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:37:19 compute-0 sshd-session[211318]: Received disconnect from 192.168.122.101 port 57332:11: disconnected by user
Feb 16 17:37:19 compute-0 sshd-session[211318]: Disconnected from user nova 192.168.122.101 port 57332
Feb 16 17:37:19 compute-0 sshd-session[211299]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:37:19 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Feb 16 17:37:19 compute-0 systemd-logind[821]: Session 35 logged out. Waiting for processes to exit.
Feb 16 17:37:19 compute-0 systemd-logind[821]: Removed session 35.
Feb 16 17:37:19 compute-0 nova_compute[186176]: 2026-02-16 17:37:19.890 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:20 compute-0 nova_compute[186176]: 2026-02-16 17:37:20.179 186180 DEBUG nova.compute.manager [req-108116b0-d045-4dff-9f21-37a22764b54f req-4771c535-5daa-44ad-91ff-0c548a260539 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-unplugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:37:20 compute-0 nova_compute[186176]: 2026-02-16 17:37:20.180 186180 DEBUG oslo_concurrency.lockutils [req-108116b0-d045-4dff-9f21-37a22764b54f req-4771c535-5daa-44ad-91ff-0c548a260539 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:20 compute-0 nova_compute[186176]: 2026-02-16 17:37:20.180 186180 DEBUG oslo_concurrency.lockutils [req-108116b0-d045-4dff-9f21-37a22764b54f req-4771c535-5daa-44ad-91ff-0c548a260539 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:20 compute-0 nova_compute[186176]: 2026-02-16 17:37:20.181 186180 DEBUG oslo_concurrency.lockutils [req-108116b0-d045-4dff-9f21-37a22764b54f req-4771c535-5daa-44ad-91ff-0c548a260539 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:20 compute-0 nova_compute[186176]: 2026-02-16 17:37:20.181 186180 DEBUG nova.compute.manager [req-108116b0-d045-4dff-9f21-37a22764b54f req-4771c535-5daa-44ad-91ff-0c548a260539 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] No waiting events found dispatching network-vif-unplugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:37:20 compute-0 nova_compute[186176]: 2026-02-16 17:37:20.182 186180 DEBUG nova.compute.manager [req-108116b0-d045-4dff-9f21-37a22764b54f req-4771c535-5daa-44ad-91ff-0c548a260539 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-unplugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:37:20 compute-0 nova_compute[186176]: 2026-02-16 17:37:20.989 186180 INFO nova.compute.manager [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Took 5.04 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 16 17:37:20 compute-0 nova_compute[186176]: 2026-02-16 17:37:20.989 186180 DEBUG nova.compute.manager [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.004 186180 DEBUG nova.compute.manager [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpecenznre',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='1fa1c686-a82d-4522-8330-1c9cdf431cc1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(5fa437cf-9377-4ce5-b738-9affc3b0c5c4),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.025 186180 DEBUG nova.objects.instance [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid 1fa1c686-a82d-4522-8330-1c9cdf431cc1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.026 186180 DEBUG nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.027 186180 DEBUG nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.027 186180 DEBUG nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.042 186180 DEBUG nova.virt.libvirt.vif [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:36:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-866468319',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-866468319',id=13,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:36:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-wdp6ej9p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',ow
ner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:36:35Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=1fa1c686-a82d-4522-8330-1c9cdf431cc1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.043 186180 DEBUG nova.network.os_vif_util [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.044 186180 DEBUG nova.network.os_vif_util [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:b9:77,bridge_name='br-int',has_traffic_filtering=True,id=a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa39e1965-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.045 186180 DEBUG nova.virt.libvirt.migration [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 17:37:21 compute-0 nova_compute[186176]:   <mac address="fa:16:3e:86:b9:77"/>
Feb 16 17:37:21 compute-0 nova_compute[186176]:   <model type="virtio"/>
Feb 16 17:37:21 compute-0 nova_compute[186176]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:37:21 compute-0 nova_compute[186176]:   <mtu size="1442"/>
Feb 16 17:37:21 compute-0 nova_compute[186176]:   <target dev="tapa39e1965-a5"/>
Feb 16 17:37:21 compute-0 nova_compute[186176]: </interface>
Feb 16 17:37:21 compute-0 nova_compute[186176]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.046 186180 DEBUG nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.531 186180 DEBUG nova.virt.libvirt.migration [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.531 186180 INFO nova.virt.libvirt.migration [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 17:37:21 compute-0 nova_compute[186176]: 2026-02-16 17:37:21.630 186180 INFO nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.168 186180 DEBUG nova.virt.libvirt.migration [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.168 186180 DEBUG nova.virt.libvirt.migration [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.245 186180 DEBUG nova.compute.manager [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.246 186180 DEBUG oslo_concurrency.lockutils [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.246 186180 DEBUG oslo_concurrency.lockutils [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.246 186180 DEBUG oslo_concurrency.lockutils [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.246 186180 DEBUG nova.compute.manager [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] No waiting events found dispatching network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.247 186180 WARNING nova.compute.manager [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received unexpected event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c for instance with vm_state active and task_state migrating.
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.247 186180 DEBUG nova.compute.manager [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-changed-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.247 186180 DEBUG nova.compute.manager [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Refreshing instance network info cache due to event network-changed-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.247 186180 DEBUG oslo_concurrency.lockutils [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.247 186180 DEBUG oslo_concurrency.lockutils [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.248 186180 DEBUG nova.network.neutron [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Refreshing network info cache for port a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.672 186180 DEBUG nova.virt.libvirt.migration [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.673 186180 DEBUG nova.virt.libvirt.migration [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.719 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263442.7193573, 1fa1c686-a82d-4522-8330-1c9cdf431cc1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.720 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] VM Paused (Lifecycle Event)
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.743 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.748 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.769 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 17:37:22 compute-0 kernel: tapa39e1965-a5 (unregistering): left promiscuous mode
Feb 16 17:37:22 compute-0 NetworkManager[56463]: <info>  [1771263442.8569] device (tapa39e1965-a5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.857 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.864 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:22 compute-0 ovn_controller[96437]: 2026-02-16T17:37:22Z|00122|binding|INFO|Releasing lport a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c from this chassis (sb_readonly=0)
Feb 16 17:37:22 compute-0 ovn_controller[96437]: 2026-02-16T17:37:22Z|00123|binding|INFO|Setting lport a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c down in Southbound
Feb 16 17:37:22 compute-0 ovn_controller[96437]: 2026-02-16T17:37:22Z|00124|binding|INFO|Removing iface tapa39e1965-a5 ovn-installed in OVS
Feb 16 17:37:22 compute-0 nova_compute[186176]: 2026-02-16 17:37:22.873 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:22.882 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:86:b9:77 10.100.0.11'], port_security=['fa:16:3e:86:b9:77 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '2e3a84a9-c1b4-4b1e-92e3-57d0875592cc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '1fa1c686-a82d-4522-8330-1c9cdf431cc1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:37:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:22.883 105730 INFO neutron.agent.ovn.metadata.agent [-] Port a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 unbound from our chassis
Feb 16 17:37:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:22.884 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:37:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:22.887 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed6609a-764f-4fee-a79a-270aebe75289]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:37:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:22.887 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace which is not needed anymore
Feb 16 17:37:22 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Feb 16 17:37:22 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000d.scope: Consumed 13.776s CPU time.
Feb 16 17:37:22 compute-0 systemd-machined[155631]: Machine qemu-11-instance-0000000d terminated.
Feb 16 17:37:23 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211124]: [NOTICE]   (211128) : haproxy version is 2.8.14-c23fe91
Feb 16 17:37:23 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211124]: [NOTICE]   (211128) : path to executable is /usr/sbin/haproxy
Feb 16 17:37:23 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211124]: [WARNING]  (211128) : Exiting Master process...
Feb 16 17:37:23 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211124]: [ALERT]    (211128) : Current worker (211130) exited with code 143 (Terminated)
Feb 16 17:37:23 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211124]: [WARNING]  (211128) : All workers exited. Exiting... (0)
Feb 16 17:37:23 compute-0 systemd[1]: libpod-99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e.scope: Deactivated successfully.
Feb 16 17:37:23 compute-0 conmon[211124]: conmon 99cf1b9e08ba8b16bae7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e.scope/container/memory.events
Feb 16 17:37:23 compute-0 podman[211358]: 2026-02-16 17:37:23.056424583 +0000 UTC m=+0.058589917 container died 99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.088 186180 DEBUG nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 17:37:23 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e-userdata-shm.mount: Deactivated successfully.
Feb 16 17:37:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-a1da3c145a34468f23a1e13f43f9809fa13198a7fe827eb4ea4cd6a9d49229b6-merged.mount: Deactivated successfully.
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.089 186180 DEBUG nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.094 186180 DEBUG nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 17:37:23 compute-0 podman[211358]: 2026-02-16 17:37:23.100715314 +0000 UTC m=+0.102880648 container cleanup 99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 17:37:23 compute-0 systemd[1]: libpod-conmon-99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e.scope: Deactivated successfully.
Feb 16 17:37:23 compute-0 podman[211404]: 2026-02-16 17:37:23.175597395 +0000 UTC m=+0.052374483 container remove 99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.175 186180 DEBUG nova.virt.libvirt.guest [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '1fa1c686-a82d-4522-8330-1c9cdf431cc1' (instance-0000000d) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.176 186180 INFO nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Migration operation has completed
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.176 186180 INFO nova.compute.manager [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] _post_live_migration() is started..
Feb 16 17:37:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:23.181 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[d269f9e3-11fc-43ce-bec9-410dacf384cd]: (4, ('Mon Feb 16 05:37:22 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e)\n99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e\nMon Feb 16 05:37:23 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e)\n99cf1b9e08ba8b16bae7bb6923bbb6fe877526835cf6a812cb8a9be265ac669e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:37:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:23.183 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ac7c2e-d5c1-4573-bc2c-c31c53dcc882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:37:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:23.184 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.186 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:23 compute-0 kernel: tap94cafcd0-c0: left promiscuous mode
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.197 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.198 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:23.202 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[ddb0d2c8-c9c8-462c-9333-24a5dcd39641]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:37:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:23.217 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[09be5df8-15d7-47e7-9a38-cffc3f3fb506]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:37:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:23.218 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[a930cc99-b48e-4596-9b93-5e6f7c7763c1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:37:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:23.235 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[c74af4d5-4072-4dd6-b164-4f3c7d83507f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499098, 'reachable_time': 17319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211423, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:37:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:23.239 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:37:23 compute-0 systemd[1]: run-netns-ovnmeta\x2d94cafcd0\x2dc7c2\x2d48b4\x2da2dd\x2d21c16ce48dc4.mount: Deactivated successfully.
Feb 16 17:37:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:23.239 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[d706c174-a4af-4b8d-afa6-5b1b49e7cf1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.301 186180 DEBUG nova.compute.manager [req-9fd76a90-2b99-4c3a-8afa-92402e75b0c1 req-7fc06ecf-c61f-450e-9b79-69c1a86b9b97 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-unplugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.301 186180 DEBUG oslo_concurrency.lockutils [req-9fd76a90-2b99-4c3a-8afa-92402e75b0c1 req-7fc06ecf-c61f-450e-9b79-69c1a86b9b97 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.302 186180 DEBUG oslo_concurrency.lockutils [req-9fd76a90-2b99-4c3a-8afa-92402e75b0c1 req-7fc06ecf-c61f-450e-9b79-69c1a86b9b97 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.303 186180 DEBUG oslo_concurrency.lockutils [req-9fd76a90-2b99-4c3a-8afa-92402e75b0c1 req-7fc06ecf-c61f-450e-9b79-69c1a86b9b97 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.304 186180 DEBUG nova.compute.manager [req-9fd76a90-2b99-4c3a-8afa-92402e75b0c1 req-7fc06ecf-c61f-450e-9b79-69c1a86b9b97 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] No waiting events found dispatching network-vif-unplugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.304 186180 DEBUG nova.compute.manager [req-9fd76a90-2b99-4c3a-8afa-92402e75b0c1 req-7fc06ecf-c61f-450e-9b79-69c1a86b9b97 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-unplugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.856 186180 DEBUG nova.network.neutron [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Updated VIF entry in instance network info cache for port a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.856 186180 DEBUG nova.network.neutron [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Updating instance_info_cache with network_info: [{"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:37:23 compute-0 nova_compute[186176]: 2026-02-16 17:37:23.900 186180 DEBUG oslo_concurrency.lockutils [req-c5eacb09-f1f1-4ff9-abd2-94a52834c01b req-9777a4b8-e919-4e87-b671-38efc831eed1 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-1fa1c686-a82d-4522-8330-1c9cdf431cc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.351 186180 DEBUG nova.compute.manager [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-unplugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.352 186180 DEBUG oslo_concurrency.lockutils [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.353 186180 DEBUG oslo_concurrency.lockutils [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.353 186180 DEBUG oslo_concurrency.lockutils [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.354 186180 DEBUG nova.compute.manager [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] No waiting events found dispatching network-vif-unplugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.354 186180 DEBUG nova.compute.manager [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-unplugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.355 186180 DEBUG nova.compute.manager [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.355 186180 DEBUG oslo_concurrency.lockutils [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.356 186180 DEBUG oslo_concurrency.lockutils [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.356 186180 DEBUG oslo_concurrency.lockutils [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.357 186180 DEBUG nova.compute.manager [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] No waiting events found dispatching network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.357 186180 WARNING nova.compute.manager [req-fb95041a-556a-4b3b-9802-3125c6dac0ea req-5aec9858-7a30-411b-8907-e5b835e00f17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received unexpected event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c for instance with vm_state active and task_state migrating.
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.504 186180 DEBUG nova.network.neutron [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Activated binding for port a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.505 186180 DEBUG nova.compute.manager [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.507 186180 DEBUG nova.virt.libvirt.vif [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:36:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-866468319',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-866468319',id=13,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:36:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-wdp6ej9p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:37:12Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=1fa1c686-a82d-4522-8330-1c9cdf431cc1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.508 186180 DEBUG nova.network.os_vif_util [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "address": "fa:16:3e:86:b9:77", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa39e1965-a5", "ovs_interfaceid": "a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.509 186180 DEBUG nova.network.os_vif_util [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:86:b9:77,bridge_name='br-int',has_traffic_filtering=True,id=a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa39e1965-a5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.509 186180 DEBUG os_vif [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:b9:77,bridge_name='br-int',has_traffic_filtering=True,id=a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa39e1965-a5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.513 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.513 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa39e1965-a5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.516 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.519 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.523 186180 INFO os_vif [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:86:b9:77,bridge_name='br-int',has_traffic_filtering=True,id=a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa39e1965-a5')
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.523 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.524 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.524 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.525 186180 DEBUG nova.compute.manager [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.525 186180 INFO nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Deleting instance files /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1_del
Feb 16 17:37:24 compute-0 nova_compute[186176]: 2026-02-16 17:37:24.526 186180 INFO nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Deletion of /var/lib/nova/instances/1fa1c686-a82d-4522-8330-1c9cdf431cc1_del complete
Feb 16 17:37:25 compute-0 podman[211425]: 2026-02-16 17:37:25.10554674 +0000 UTC m=+0.066818701 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:37:25 compute-0 podman[211424]: 2026-02-16 17:37:25.149128523 +0000 UTC m=+0.112617000 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.415 186180 DEBUG nova.compute.manager [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.415 186180 DEBUG oslo_concurrency.lockutils [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.416 186180 DEBUG oslo_concurrency.lockutils [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.416 186180 DEBUG oslo_concurrency.lockutils [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.416 186180 DEBUG nova.compute.manager [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] No waiting events found dispatching network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.416 186180 WARNING nova.compute.manager [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received unexpected event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c for instance with vm_state active and task_state migrating.
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.416 186180 DEBUG nova.compute.manager [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.417 186180 DEBUG oslo_concurrency.lockutils [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.417 186180 DEBUG oslo_concurrency.lockutils [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.417 186180 DEBUG oslo_concurrency.lockutils [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.417 186180 DEBUG nova.compute.manager [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] No waiting events found dispatching network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.418 186180 WARNING nova.compute.manager [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received unexpected event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c for instance with vm_state active and task_state migrating.
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.418 186180 DEBUG nova.compute.manager [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.418 186180 DEBUG oslo_concurrency.lockutils [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.419 186180 DEBUG oslo_concurrency.lockutils [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.419 186180 DEBUG oslo_concurrency.lockutils [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.419 186180 DEBUG nova.compute.manager [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] No waiting events found dispatching network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:37:25 compute-0 nova_compute[186176]: 2026-02-16 17:37:25.419 186180 WARNING nova.compute.manager [req-b3ef38e2-2e11-45ca-a898-cdac982cabf6 req-e684b3c3-36b3-40f2-8d4b-8cecabc05909 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Received unexpected event network-vif-plugged-a39e1965-a5ce-4015-8ad4-e3e41a7f3f4c for instance with vm_state active and task_state migrating.
Feb 16 17:37:28 compute-0 nova_compute[186176]: 2026-02-16 17:37:28.236 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:29 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 17:37:29 compute-0 systemd[211303]: Activating special unit Exit the Session...
Feb 16 17:37:29 compute-0 systemd[211303]: Stopped target Main User Target.
Feb 16 17:37:29 compute-0 systemd[211303]: Stopped target Basic System.
Feb 16 17:37:29 compute-0 systemd[211303]: Stopped target Paths.
Feb 16 17:37:29 compute-0 systemd[211303]: Stopped target Sockets.
Feb 16 17:37:29 compute-0 systemd[211303]: Stopped target Timers.
Feb 16 17:37:29 compute-0 systemd[211303]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:37:29 compute-0 systemd[211303]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 17:37:29 compute-0 systemd[211303]: Closed D-Bus User Message Bus Socket.
Feb 16 17:37:29 compute-0 systemd[211303]: Stopped Create User's Volatile Files and Directories.
Feb 16 17:37:29 compute-0 systemd[211303]: Removed slice User Application Slice.
Feb 16 17:37:29 compute-0 systemd[211303]: Reached target Shutdown.
Feb 16 17:37:29 compute-0 systemd[211303]: Finished Exit the Session.
Feb 16 17:37:29 compute-0 systemd[211303]: Reached target Exit the Session.
Feb 16 17:37:29 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 17:37:29 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 17:37:29 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 17:37:29 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 17:37:29 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 17:37:29 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 17:37:29 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 17:37:29 compute-0 nova_compute[186176]: 2026-02-16 17:37:29.517 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:29 compute-0 podman[195505]: time="2026-02-16T17:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:37:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:37:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.048 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.049 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.049 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "1fa1c686-a82d-4522-8330-1c9cdf431cc1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.088 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.089 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.089 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.090 186180 DEBUG nova.compute.resource_tracker [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.392 186180 WARNING nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.394 186180 DEBUG nova.compute.resource_tracker [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5821MB free_disk=73.2238883972168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.394 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.395 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:31 compute-0 openstack_network_exporter[198360]: ERROR   17:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:37:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:37:31 compute-0 openstack_network_exporter[198360]: ERROR   17:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:37:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.435 186180 DEBUG nova.compute.resource_tracker [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Migration for instance 1fa1c686-a82d-4522-8330-1c9cdf431cc1 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.458 186180 DEBUG nova.compute.resource_tracker [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.497 186180 DEBUG nova.compute.resource_tracker [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Migration 5fa437cf-9377-4ce5-b738-9affc3b0c5c4 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.498 186180 DEBUG nova.compute.resource_tracker [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.498 186180 DEBUG nova.compute.resource_tracker [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.542 186180 DEBUG nova.compute.provider_tree [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.558 186180 DEBUG nova.scheduler.client.report [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.593 186180 DEBUG nova.compute.resource_tracker [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.594 186180 DEBUG oslo_concurrency.lockutils [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.600 186180 INFO nova.compute.manager [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.699 186180 INFO nova.scheduler.client.report [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Deleted allocation for migration 5fa437cf-9377-4ce5-b738-9affc3b0c5c4
Feb 16 17:37:31 compute-0 nova_compute[186176]: 2026-02-16 17:37:31.699 186180 DEBUG nova.virt.libvirt.driver [None req-c06cbf0e-d817-469d-88bd-0348c9abf303 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 17:37:33 compute-0 nova_compute[186176]: 2026-02-16 17:37:33.304 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:34 compute-0 nova_compute[186176]: 2026-02-16 17:37:34.520 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:38 compute-0 nova_compute[186176]: 2026-02-16 17:37:38.087 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771263443.083513, 1fa1c686-a82d-4522-8330-1c9cdf431cc1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:37:38 compute-0 nova_compute[186176]: 2026-02-16 17:37:38.088 186180 INFO nova.compute.manager [-] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] VM Stopped (Lifecycle Event)
Feb 16 17:37:38 compute-0 nova_compute[186176]: 2026-02-16 17:37:38.111 186180 DEBUG nova.compute.manager [None req-f1bb87b3-6bc7-4096-8d0f-cecbe78ef71e - - - - - -] [instance: 1fa1c686-a82d-4522-8330-1c9cdf431cc1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:37:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:38.170 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:38.170 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:37:38.171 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:38 compute-0 nova_compute[186176]: 2026-02-16 17:37:38.343 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:39 compute-0 nova_compute[186176]: 2026-02-16 17:37:39.559 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:41 compute-0 nova_compute[186176]: 2026-02-16 17:37:41.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:37:41 compute-0 nova_compute[186176]: 2026-02-16 17:37:41.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:37:41 compute-0 nova_compute[186176]: 2026-02-16 17:37:41.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:37:41 compute-0 nova_compute[186176]: 2026-02-16 17:37:41.334 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:37:41 compute-0 nova_compute[186176]: 2026-02-16 17:37:41.335 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:37:43 compute-0 nova_compute[186176]: 2026-02-16 17:37:43.378 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:44 compute-0 nova_compute[186176]: 2026-02-16 17:37:44.611 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.363 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.364 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.365 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.365 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.515 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.516 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5829MB free_disk=73.2238883972168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.517 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.517 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.576 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.576 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.600 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.620 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.622 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:37:45 compute-0 nova_compute[186176]: 2026-02-16 17:37:45.622 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:37:46 compute-0 podman[211477]: 2026-02-16 17:37:46.095740243 +0000 UTC m=+0.060947216 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.7, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, name=ubi9/ubi-minimal, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 17:37:46 compute-0 nova_compute[186176]: 2026-02-16 17:37:46.623 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:37:47 compute-0 nova_compute[186176]: 2026-02-16 17:37:47.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:37:47 compute-0 nova_compute[186176]: 2026-02-16 17:37:47.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:37:47 compute-0 nova_compute[186176]: 2026-02-16 17:37:47.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:37:48 compute-0 podman[211499]: 2026-02-16 17:37:48.100646872 +0000 UTC m=+0.067491118 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 17:37:48 compute-0 nova_compute[186176]: 2026-02-16 17:37:48.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:37:48 compute-0 nova_compute[186176]: 2026-02-16 17:37:48.429 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:49 compute-0 nova_compute[186176]: 2026-02-16 17:37:49.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:37:49 compute-0 nova_compute[186176]: 2026-02-16 17:37:49.614 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:53 compute-0 nova_compute[186176]: 2026-02-16 17:37:53.432 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:54 compute-0 nova_compute[186176]: 2026-02-16 17:37:54.652 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:55 compute-0 nova_compute[186176]: 2026-02-16 17:37:55.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:37:56 compute-0 podman[211519]: 2026-02-16 17:37:56.100804058 +0000 UTC m=+0.061373448 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:37:56 compute-0 podman[211518]: 2026-02-16 17:37:56.117603202 +0000 UTC m=+0.083884200 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 17:37:58 compute-0 nova_compute[186176]: 2026-02-16 17:37:58.479 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:59 compute-0 nova_compute[186176]: 2026-02-16 17:37:59.662 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:37:59 compute-0 podman[195505]: time="2026-02-16T17:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:37:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:37:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2173 "" "Go-http-client/1.1"
Feb 16 17:38:01 compute-0 openstack_network_exporter[198360]: ERROR   17:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:38:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:38:01 compute-0 openstack_network_exporter[198360]: ERROR   17:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:38:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:38:03 compute-0 nova_compute[186176]: 2026-02-16 17:38:03.480 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:04 compute-0 nova_compute[186176]: 2026-02-16 17:38:04.701 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:08 compute-0 nova_compute[186176]: 2026-02-16 17:38:08.511 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:08 compute-0 ovn_controller[96437]: 2026-02-16T17:38:08Z|00125|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Feb 16 17:38:09 compute-0 nova_compute[186176]: 2026-02-16 17:38:09.704 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:11 compute-0 nova_compute[186176]: 2026-02-16 17:38:11.404 186180 DEBUG nova.compute.manager [None req-be5b7882-7847-4a2c-b8b4-cfb024d5b568 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider bb904aac-529f-46ef-9861-9c655a4b383c in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 16 17:38:11 compute-0 nova_compute[186176]: 2026-02-16 17:38:11.453 186180 DEBUG nova.compute.provider_tree [None req-be5b7882-7847-4a2c-b8b4-cfb024d5b568 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Updating resource provider bb904aac-529f-46ef-9861-9c655a4b383c generation from 21 to 24 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 17:38:13 compute-0 nova_compute[186176]: 2026-02-16 17:38:13.512 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.208 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.209 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.225 186180 DEBUG nova.compute.manager [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.316 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.317 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.323 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.324 186180 INFO nova.compute.claims [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.440 186180 DEBUG nova.compute.provider_tree [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.454 186180 DEBUG nova.scheduler.client.report [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.474 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.475 186180 DEBUG nova.compute.manager [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.513 186180 DEBUG nova.compute.manager [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.513 186180 DEBUG nova.network.neutron [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.528 186180 INFO nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.542 186180 DEBUG nova.compute.manager [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.613 186180 DEBUG nova.compute.manager [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.615 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.615 186180 INFO nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Creating image(s)
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.615 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "/var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.616 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.616 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.627 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.708 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.709 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.710 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.719 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.736 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.780 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.781 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.812 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.813 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.814 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.831 186180 DEBUG nova.policy [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54934f49b2044289bcf127662fe114b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.860 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.861 186180 DEBUG nova.virt.disk.api [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Checking if we can resize image /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.862 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.907 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.908 186180 DEBUG nova.virt.disk.api [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Cannot resize image /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.908 186180 DEBUG nova.objects.instance [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'migration_context' on Instance uuid e039ddb6-babe-4626-a22d-fe3afde47f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.924 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.925 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Ensure instance console log exists: /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.925 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.926 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:38:14 compute-0 nova_compute[186176]: 2026-02-16 17:38:14.926 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:38:17 compute-0 podman[211586]: 2026-02-16 17:38:17.115512506 +0000 UTC m=+0.075594200 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, version=9.7, vendor=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Feb 16 17:38:17 compute-0 nova_compute[186176]: 2026-02-16 17:38:17.934 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:17.934 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:38:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:17.935 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:38:18 compute-0 nova_compute[186176]: 2026-02-16 17:38:18.040 186180 DEBUG nova.network.neutron [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Successfully created port: f4d3c9aa-91fe-4379-8354-ae62aafad774 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:38:18 compute-0 nova_compute[186176]: 2026-02-16 17:38:18.559 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:19 compute-0 podman[211608]: 2026-02-16 17:38:19.132982778 +0000 UTC m=+0.091357330 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:38:19 compute-0 nova_compute[186176]: 2026-02-16 17:38:19.740 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:20 compute-0 nova_compute[186176]: 2026-02-16 17:38:20.107 186180 DEBUG nova.network.neutron [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Successfully updated port: f4d3c9aa-91fe-4379-8354-ae62aafad774 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:38:20 compute-0 nova_compute[186176]: 2026-02-16 17:38:20.123 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:38:20 compute-0 nova_compute[186176]: 2026-02-16 17:38:20.123 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquired lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:38:20 compute-0 nova_compute[186176]: 2026-02-16 17:38:20.123 186180 DEBUG nova.network.neutron [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:38:20 compute-0 nova_compute[186176]: 2026-02-16 17:38:20.180 186180 DEBUG nova.compute.manager [req-46890a88-4868-4680-a091-a1035ff87c2c req-faf55a55-83a2-4582-92da-d9135ab9a7cc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-changed-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:38:20 compute-0 nova_compute[186176]: 2026-02-16 17:38:20.180 186180 DEBUG nova.compute.manager [req-46890a88-4868-4680-a091-a1035ff87c2c req-faf55a55-83a2-4582-92da-d9135ab9a7cc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Refreshing instance network info cache due to event network-changed-f4d3c9aa-91fe-4379-8354-ae62aafad774. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:38:20 compute-0 nova_compute[186176]: 2026-02-16 17:38:20.181 186180 DEBUG oslo_concurrency.lockutils [req-46890a88-4868-4680-a091-a1035ff87c2c req-faf55a55-83a2-4582-92da-d9135ab9a7cc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:38:20 compute-0 nova_compute[186176]: 2026-02-16 17:38:20.901 186180 DEBUG nova.network.neutron [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.071 186180 DEBUG nova.network.neutron [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Updating instance_info_cache with network_info: [{"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.090 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Releasing lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.091 186180 DEBUG nova.compute.manager [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Instance network_info: |[{"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.092 186180 DEBUG oslo_concurrency.lockutils [req-46890a88-4868-4680-a091-a1035ff87c2c req-faf55a55-83a2-4582-92da-d9135ab9a7cc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.092 186180 DEBUG nova.network.neutron [req-46890a88-4868-4680-a091-a1035ff87c2c req-faf55a55-83a2-4582-92da-d9135ab9a7cc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Refreshing network info cache for port f4d3c9aa-91fe-4379-8354-ae62aafad774 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.098 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Start _get_guest_xml network_info=[{"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.106 186180 WARNING nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.119 186180 DEBUG nova.virt.libvirt.host [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.120 186180 DEBUG nova.virt.libvirt.host [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.126 186180 DEBUG nova.virt.libvirt.host [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.127 186180 DEBUG nova.virt.libvirt.host [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.130 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.130 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.131 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.132 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.132 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.132 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.133 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.133 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.133 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.134 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.134 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.134 186180 DEBUG nova.virt.hardware [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.141 186180 DEBUG nova.virt.libvirt.vif [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:38:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1452410303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1452410303',id=15,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-t0k52ig0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=TagList,task
_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:38:14Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=e039ddb6-babe-4626-a22d-fe3afde47f55,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.141 186180 DEBUG nova.network.os_vif_util [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.143 186180 DEBUG nova.network.os_vif_util [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:89:52,bridge_name='br-int',has_traffic_filtering=True,id=f4d3c9aa-91fe-4379-8354-ae62aafad774,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4d3c9aa-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.144 186180 DEBUG nova.objects.instance [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'pci_devices' on Instance uuid e039ddb6-babe-4626-a22d-fe3afde47f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.168 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:38:22 compute-0 nova_compute[186176]:   <uuid>e039ddb6-babe-4626-a22d-fe3afde47f55</uuid>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   <name>instance-0000000f</name>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteStrategies-server-1452410303</nova:name>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:38:22</nova:creationTime>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:38:22 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:38:22 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:38:22 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:38:22 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:38:22 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:38:22 compute-0 nova_compute[186176]:         <nova:user uuid="c54934f49b2044289bcf127662fe114b">tempest-TestExecuteStrategies-1098930400-project-member</nova:user>
Feb 16 17:38:22 compute-0 nova_compute[186176]:         <nova:project uuid="1a237c4b00c5426cb1dc6afe3c7c868c">tempest-TestExecuteStrategies-1098930400</nova:project>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:38:22 compute-0 nova_compute[186176]:         <nova:port uuid="f4d3c9aa-91fe-4379-8354-ae62aafad774">
Feb 16 17:38:22 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <system>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <entry name="serial">e039ddb6-babe-4626-a22d-fe3afde47f55</entry>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <entry name="uuid">e039ddb6-babe-4626-a22d-fe3afde47f55</entry>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     </system>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   <os>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   </os>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   <features>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   </features>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk.config"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:d3:89:52"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <target dev="tapf4d3c9aa-91"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/console.log" append="off"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <video>
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     </video>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:38:22 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:38:22 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:38:22 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:38:22 compute-0 nova_compute[186176]: </domain>
Feb 16 17:38:22 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.169 186180 DEBUG nova.compute.manager [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Preparing to wait for external event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.170 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.170 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.171 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.172 186180 DEBUG nova.virt.libvirt.vif [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:38:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1452410303',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1452410303',id=15,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-t0k52ig0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:38:14Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=e039ddb6-babe-4626-a22d-fe3afde47f55,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.172 186180 DEBUG nova.network.os_vif_util [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.173 186180 DEBUG nova.network.os_vif_util [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:89:52,bridge_name='br-int',has_traffic_filtering=True,id=f4d3c9aa-91fe-4379-8354-ae62aafad774,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4d3c9aa-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.174 186180 DEBUG os_vif [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:89:52,bridge_name='br-int',has_traffic_filtering=True,id=f4d3c9aa-91fe-4379-8354-ae62aafad774,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4d3c9aa-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.175 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.176 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.176 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.182 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.183 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4d3c9aa-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.184 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf4d3c9aa-91, col_values=(('external_ids', {'iface-id': 'f4d3c9aa-91fe-4379-8354-ae62aafad774', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:89:52', 'vm-uuid': 'e039ddb6-babe-4626-a22d-fe3afde47f55'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:38:22 compute-0 NetworkManager[56463]: <info>  [1771263502.2173] manager: (tapf4d3c9aa-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.216 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.221 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.223 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.224 186180 INFO os_vif [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:89:52,bridge_name='br-int',has_traffic_filtering=True,id=f4d3c9aa-91fe-4379-8354-ae62aafad774,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4d3c9aa-91')
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.289 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.290 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.290 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No VIF found with MAC fa:16:3e:d3:89:52, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:38:22 compute-0 nova_compute[186176]: 2026-02-16 17:38:22.291 186180 INFO nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Using config drive
Feb 16 17:38:23 compute-0 nova_compute[186176]: 2026-02-16 17:38:23.054 186180 INFO nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Creating config drive at /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk.config
Feb 16 17:38:23 compute-0 nova_compute[186176]: 2026-02-16 17:38:23.061 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjfehkodz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:38:23 compute-0 nova_compute[186176]: 2026-02-16 17:38:23.187 186180 DEBUG oslo_concurrency.processutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpjfehkodz" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:38:23 compute-0 kernel: tapf4d3c9aa-91: entered promiscuous mode
Feb 16 17:38:23 compute-0 NetworkManager[56463]: <info>  [1771263503.2636] manager: (tapf4d3c9aa-91): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Feb 16 17:38:23 compute-0 ovn_controller[96437]: 2026-02-16T17:38:23Z|00126|binding|INFO|Claiming lport f4d3c9aa-91fe-4379-8354-ae62aafad774 for this chassis.
Feb 16 17:38:23 compute-0 ovn_controller[96437]: 2026-02-16T17:38:23Z|00127|binding|INFO|f4d3c9aa-91fe-4379-8354-ae62aafad774: Claiming fa:16:3e:d3:89:52 10.100.0.12
Feb 16 17:38:23 compute-0 nova_compute[186176]: 2026-02-16 17:38:23.306 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:23 compute-0 ovn_controller[96437]: 2026-02-16T17:38:23Z|00128|binding|INFO|Setting lport f4d3c9aa-91fe-4379-8354-ae62aafad774 ovn-installed in OVS
Feb 16 17:38:23 compute-0 ovn_controller[96437]: 2026-02-16T17:38:23Z|00129|binding|INFO|Setting lport f4d3c9aa-91fe-4379-8354-ae62aafad774 up in Southbound
Feb 16 17:38:23 compute-0 nova_compute[186176]: 2026-02-16 17:38:23.311 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.314 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:89:52 10.100.0.12'], port_security=['fa:16:3e:d3:89:52 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e039ddb6-babe-4626-a22d-fe3afde47f55', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=f4d3c9aa-91fe-4379-8354-ae62aafad774) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.316 105730 INFO neutron.agent.ovn.metadata.agent [-] Port f4d3c9aa-91fe-4379-8354-ae62aafad774 in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 bound to our chassis
Feb 16 17:38:23 compute-0 nova_compute[186176]: 2026-02-16 17:38:23.317 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.318 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:38:23 compute-0 systemd-udevd[211647]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.330 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[dc9f13b6-335d-4ab3-b2fc-c2f0df14ad1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.331 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap94cafcd0-c1 in ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.335 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap94cafcd0-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.335 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[71724c1f-40c5-4f00-b54f-80eb10608353]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.336 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f3fa638a-7f10-49bb-b926-d6b23b0e48a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 systemd-machined[155631]: New machine qemu-12-instance-0000000f.
Feb 16 17:38:23 compute-0 NetworkManager[56463]: <info>  [1771263503.3454] device (tapf4d3c9aa-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:38:23 compute-0 NetworkManager[56463]: <info>  [1771263503.3459] device (tapf4d3c9aa-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.348 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[6e431439-4762-4a78-b1e0-379b80d2156c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000f.
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.370 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[ae2494ed-ff26-43ee-9d59-f0a037b24cd1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.403 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[4ca370c0-c44e-46e9-ac54-db83df450482]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.412 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b5f43492-6d83-4398-b56c-44b138a9727a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 NetworkManager[56463]: <info>  [1771263503.4138] manager: (tap94cafcd0-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.451 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[25115ae7-0d60-4794-b6f2-87f0be76628c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.455 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[3169e3e2-c972-48ec-8b48-ee0fda5f5ba7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 NetworkManager[56463]: <info>  [1771263503.4793] device (tap94cafcd0-c0): carrier: link connected
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.483 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa99f58-4bfc-4a1b-96b2-435670d19679]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.501 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca66dd1-7a5c-48ea-a737-41693246adae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509900, 'reachable_time': 32150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211680, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.514 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e624ce96-23bf-4e46-b50f-c16cef6c575f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 509900, 'tstamp': 509900}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211681, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.531 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[03ed4c8b-2474-4dfd-80a0-8bf233acf200]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509900, 'reachable_time': 32150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211682, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.556 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[c47ed5fc-b412-4a38-ae4c-243ff3011fbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 nova_compute[186176]: 2026-02-16 17:38:23.564 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.625 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[bea063f3-c3fe-4687-bd02-1b6198104a0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.626 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.627 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.627 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94cafcd0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:38:23 compute-0 nova_compute[186176]: 2026-02-16 17:38:23.629 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:23 compute-0 NetworkManager[56463]: <info>  [1771263503.6301] manager: (tap94cafcd0-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Feb 16 17:38:23 compute-0 kernel: tap94cafcd0-c0: entered promiscuous mode
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.633 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94cafcd0-c0, col_values=(('external_ids', {'iface-id': '5c28d585-b48c-40c6-b5e7-f1e59317b2de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:38:23 compute-0 nova_compute[186176]: 2026-02-16 17:38:23.634 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:23 compute-0 ovn_controller[96437]: 2026-02-16T17:38:23Z|00130|binding|INFO|Releasing lport 5c28d585-b48c-40c6-b5e7-f1e59317b2de from this chassis (sb_readonly=0)
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.635 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.636 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[2fde64fa-0d6b-4b8d-b79b-f02afa0c9872]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.637 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:38:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:23.638 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'env', 'PROCESS_TAG=haproxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:38:23 compute-0 nova_compute[186176]: 2026-02-16 17:38:23.639 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:23 compute-0 podman[211714]: 2026-02-16 17:38:23.990590784 +0000 UTC m=+0.049016901 container create d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:38:24 compute-0 systemd[1]: Started libpod-conmon-d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862.scope.
Feb 16 17:38:24 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:38:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/913f3fda316971555d6dff63a7f3f35d38baa4e4efef44ed32d245ee62ed07d3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:38:24 compute-0 podman[211714]: 2026-02-16 17:38:23.965107031 +0000 UTC m=+0.023533158 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:38:24 compute-0 podman[211714]: 2026-02-16 17:38:24.062172146 +0000 UTC m=+0.120598273 container init d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:38:24 compute-0 podman[211714]: 2026-02-16 17:38:24.06813532 +0000 UTC m=+0.126561427 container start d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 16 17:38:24 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211732]: [NOTICE]   (211741) : New worker (211743) forked
Feb 16 17:38:24 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211732]: [NOTICE]   (211741) : Loading success.
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.086 186180 DEBUG nova.network.neutron [req-46890a88-4868-4680-a091-a1035ff87c2c req-faf55a55-83a2-4582-92da-d9135ab9a7cc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Updated VIF entry in instance network info cache for port f4d3c9aa-91fe-4379-8354-ae62aafad774. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.088 186180 DEBUG nova.network.neutron [req-46890a88-4868-4680-a091-a1035ff87c2c req-faf55a55-83a2-4582-92da-d9135ab9a7cc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Updating instance_info_cache with network_info: [{"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.093 186180 DEBUG nova.compute.manager [req-f1f3d590-75fa-455b-ae28-2efa192ad3db req-d159e768-7320-4945-8d7e-d610b299ffdd 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.094 186180 DEBUG oslo_concurrency.lockutils [req-f1f3d590-75fa-455b-ae28-2efa192ad3db req-d159e768-7320-4945-8d7e-d610b299ffdd 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.094 186180 DEBUG oslo_concurrency.lockutils [req-f1f3d590-75fa-455b-ae28-2efa192ad3db req-d159e768-7320-4945-8d7e-d610b299ffdd 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.095 186180 DEBUG oslo_concurrency.lockutils [req-f1f3d590-75fa-455b-ae28-2efa192ad3db req-d159e768-7320-4945-8d7e-d610b299ffdd 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.096 186180 DEBUG nova.compute.manager [req-f1f3d590-75fa-455b-ae28-2efa192ad3db req-d159e768-7320-4945-8d7e-d610b299ffdd 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Processing event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.103 186180 DEBUG oslo_concurrency.lockutils [req-46890a88-4868-4680-a091-a1035ff87c2c req-faf55a55-83a2-4582-92da-d9135ab9a7cc 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.106 186180 DEBUG nova.compute.manager [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.107 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263504.1063704, e039ddb6-babe-4626-a22d-fe3afde47f55 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.108 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] VM Started (Lifecycle Event)
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.111 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.117 186180 INFO nova.virt.libvirt.driver [-] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Instance spawned successfully.
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.117 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.129 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.138 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.143 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.144 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.145 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.145 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.146 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.147 186180 DEBUG nova.virt.libvirt.driver [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.170 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.170 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263504.107363, e039ddb6-babe-4626-a22d-fe3afde47f55 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.171 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] VM Paused (Lifecycle Event)
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.204 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.208 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263504.110943, e039ddb6-babe-4626-a22d-fe3afde47f55 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.209 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] VM Resumed (Lifecycle Event)
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.214 186180 INFO nova.compute.manager [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Took 9.60 seconds to spawn the instance on the hypervisor.
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.214 186180 DEBUG nova.compute.manager [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.225 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.229 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.253 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.274 186180 INFO nova.compute.manager [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Took 9.98 seconds to build instance.
Feb 16 17:38:24 compute-0 nova_compute[186176]: 2026-02-16 17:38:24.293 186180 DEBUG oslo_concurrency.lockutils [None req-8950bac9-c8ad-44e1-86c1-2180c54412be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:38:26 compute-0 nova_compute[186176]: 2026-02-16 17:38:26.200 186180 DEBUG nova.compute.manager [req-ec387872-830d-4a81-a132-f60b7f30f09d req-8929085a-cd5e-4b33-984e-1f2cfc2bc499 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:38:26 compute-0 nova_compute[186176]: 2026-02-16 17:38:26.201 186180 DEBUG oslo_concurrency.lockutils [req-ec387872-830d-4a81-a132-f60b7f30f09d req-8929085a-cd5e-4b33-984e-1f2cfc2bc499 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:38:26 compute-0 nova_compute[186176]: 2026-02-16 17:38:26.201 186180 DEBUG oslo_concurrency.lockutils [req-ec387872-830d-4a81-a132-f60b7f30f09d req-8929085a-cd5e-4b33-984e-1f2cfc2bc499 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:38:26 compute-0 nova_compute[186176]: 2026-02-16 17:38:26.202 186180 DEBUG oslo_concurrency.lockutils [req-ec387872-830d-4a81-a132-f60b7f30f09d req-8929085a-cd5e-4b33-984e-1f2cfc2bc499 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:38:26 compute-0 nova_compute[186176]: 2026-02-16 17:38:26.202 186180 DEBUG nova.compute.manager [req-ec387872-830d-4a81-a132-f60b7f30f09d req-8929085a-cd5e-4b33-984e-1f2cfc2bc499 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] No waiting events found dispatching network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:38:26 compute-0 nova_compute[186176]: 2026-02-16 17:38:26.203 186180 WARNING nova.compute.manager [req-ec387872-830d-4a81-a132-f60b7f30f09d req-8929085a-cd5e-4b33-984e-1f2cfc2bc499 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received unexpected event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 for instance with vm_state active and task_state None.
Feb 16 17:38:27 compute-0 podman[211753]: 2026-02-16 17:38:27.123991214 +0000 UTC m=+0.078751986 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:38:27 compute-0 podman[211752]: 2026-02-16 17:38:27.141645709 +0000 UTC m=+0.096887202 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 17:38:27 compute-0 nova_compute[186176]: 2026-02-16 17:38:27.216 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:27 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:27.939 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:38:28 compute-0 nova_compute[186176]: 2026-02-16 17:38:28.565 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:29 compute-0 podman[195505]: time="2026-02-16T17:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:38:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:38:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2633 "" "Go-http-client/1.1"
Feb 16 17:38:31 compute-0 openstack_network_exporter[198360]: ERROR   17:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:38:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:38:31 compute-0 openstack_network_exporter[198360]: ERROR   17:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:38:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:38:32 compute-0 nova_compute[186176]: 2026-02-16 17:38:32.218 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:33 compute-0 nova_compute[186176]: 2026-02-16 17:38:33.594 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:35 compute-0 ovn_controller[96437]: 2026-02-16T17:38:35Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:89:52 10.100.0.12
Feb 16 17:38:35 compute-0 ovn_controller[96437]: 2026-02-16T17:38:35Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:89:52 10.100.0.12
Feb 16 17:38:37 compute-0 nova_compute[186176]: 2026-02-16 17:38:37.221 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:38.171 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:38:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:38.171 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:38:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:38:38.172 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:38:38 compute-0 nova_compute[186176]: 2026-02-16 17:38:38.598 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:42 compute-0 nova_compute[186176]: 2026-02-16 17:38:42.270 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:42 compute-0 nova_compute[186176]: 2026-02-16 17:38:42.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:38:42 compute-0 nova_compute[186176]: 2026-02-16 17:38:42.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:38:42 compute-0 nova_compute[186176]: 2026-02-16 17:38:42.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:38:42 compute-0 nova_compute[186176]: 2026-02-16 17:38:42.497 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:38:42 compute-0 nova_compute[186176]: 2026-02-16 17:38:42.498 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:38:42 compute-0 nova_compute[186176]: 2026-02-16 17:38:42.498 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:38:42 compute-0 nova_compute[186176]: 2026-02-16 17:38:42.498 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid e039ddb6-babe-4626-a22d-fe3afde47f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:38:43 compute-0 nova_compute[186176]: 2026-02-16 17:38:43.599 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:44 compute-0 nova_compute[186176]: 2026-02-16 17:38:44.093 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Updating instance_info_cache with network_info: [{"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:38:44 compute-0 nova_compute[186176]: 2026-02-16 17:38:44.114 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:38:44 compute-0 nova_compute[186176]: 2026-02-16 17:38:44.114 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:38:44 compute-0 nova_compute[186176]: 2026-02-16 17:38:44.115 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.335 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.357 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.358 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.358 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.359 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.418 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.465 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.466 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.510 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.626 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.627 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5657MB free_disk=73.19499588012695GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.627 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.628 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.686 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance e039ddb6-babe-4626-a22d-fe3afde47f55 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.686 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.687 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.722 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.735 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.758 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:38:46 compute-0 nova_compute[186176]: 2026-02-16 17:38:46.759 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:38:47 compute-0 nova_compute[186176]: 2026-02-16 17:38:47.271 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:47 compute-0 nova_compute[186176]: 2026-02-16 17:38:47.741 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:38:48 compute-0 podman[211832]: 2026-02-16 17:38:48.107299308 +0000 UTC m=+0.068368006 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9/ubi-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, vcs-type=git, version=9.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Feb 16 17:38:48 compute-0 nova_compute[186176]: 2026-02-16 17:38:48.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:38:48 compute-0 nova_compute[186176]: 2026-02-16 17:38:48.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:38:48 compute-0 nova_compute[186176]: 2026-02-16 17:38:48.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:38:48 compute-0 nova_compute[186176]: 2026-02-16 17:38:48.600 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:49 compute-0 nova_compute[186176]: 2026-02-16 17:38:49.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:38:50 compute-0 podman[211855]: 2026-02-16 17:38:50.105936675 +0000 UTC m=+0.063290183 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 16 17:38:50 compute-0 nova_compute[186176]: 2026-02-16 17:38:50.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:38:52 compute-0 nova_compute[186176]: 2026-02-16 17:38:52.316 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:53 compute-0 nova_compute[186176]: 2026-02-16 17:38:53.602 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:56 compute-0 nova_compute[186176]: 2026-02-16 17:38:56.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:38:57 compute-0 nova_compute[186176]: 2026-02-16 17:38:57.323 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:58 compute-0 podman[211875]: 2026-02-16 17:38:58.106625752 +0000 UTC m=+0.063663792 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:38:58 compute-0 podman[211874]: 2026-02-16 17:38:58.145144919 +0000 UTC m=+0.110179562 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, org.label-schema.schema-version=1.0)
Feb 16 17:38:58 compute-0 nova_compute[186176]: 2026-02-16 17:38:58.604 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:38:59 compute-0 podman[195505]: time="2026-02-16T17:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:38:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:38:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2637 "" "Go-http-client/1.1"
Feb 16 17:39:01 compute-0 openstack_network_exporter[198360]: ERROR   17:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:39:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:39:01 compute-0 openstack_network_exporter[198360]: ERROR   17:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:39:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:39:02 compute-0 nova_compute[186176]: 2026-02-16 17:39:02.329 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:02 compute-0 ovn_controller[96437]: 2026-02-16T17:39:02Z|00131|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 16 17:39:03 compute-0 nova_compute[186176]: 2026-02-16 17:39:03.645 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:07 compute-0 nova_compute[186176]: 2026-02-16 17:39:07.333 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:08 compute-0 nova_compute[186176]: 2026-02-16 17:39:08.647 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:12 compute-0 nova_compute[186176]: 2026-02-16 17:39:12.336 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:13 compute-0 nova_compute[186176]: 2026-02-16 17:39:13.649 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:17 compute-0 nova_compute[186176]: 2026-02-16 17:39:17.340 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:18 compute-0 nova_compute[186176]: 2026-02-16 17:39:18.652 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:19 compute-0 podman[211925]: 2026-02-16 17:39:19.099409464 +0000 UTC m=+0.065734613 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
architecture=x86_64, maintainer=Red Hat, Inc., release=1770267347, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7)
Feb 16 17:39:21 compute-0 podman[211946]: 2026-02-16 17:39:21.087091938 +0000 UTC m=+0.060450715 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Feb 16 17:39:21 compute-0 nova_compute[186176]: 2026-02-16 17:39:21.424 186180 DEBUG nova.compute.manager [None req-776cd644-b4ab-4514-99db-99cff2e1d799 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider bb904aac-529f-46ef-9861-9c655a4b383c in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:610
Feb 16 17:39:21 compute-0 nova_compute[186176]: 2026-02-16 17:39:21.471 186180 DEBUG nova.compute.provider_tree [None req-776cd644-b4ab-4514-99db-99cff2e1d799 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Updating resource provider bb904aac-529f-46ef-9861-9c655a4b383c generation from 24 to 26 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 17:39:22 compute-0 nova_compute[186176]: 2026-02-16 17:39:22.342 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:23 compute-0 nova_compute[186176]: 2026-02-16 17:39:23.653 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:25 compute-0 nova_compute[186176]: 2026-02-16 17:39:25.562 186180 DEBUG nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Check if temp file /var/lib/nova/instances/tmpfdhu_2we exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 17:39:25 compute-0 nova_compute[186176]: 2026-02-16 17:39:25.563 186180 DEBUG nova.compute.manager [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfdhu_2we',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e039ddb6-babe-4626-a22d-fe3afde47f55',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 17:39:26 compute-0 nova_compute[186176]: 2026-02-16 17:39:26.281 186180 DEBUG oslo_concurrency.processutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:39:26 compute-0 nova_compute[186176]: 2026-02-16 17:39:26.351 186180 DEBUG oslo_concurrency.processutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:39:26 compute-0 nova_compute[186176]: 2026-02-16 17:39:26.352 186180 DEBUG oslo_concurrency.processutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:39:26 compute-0 nova_compute[186176]: 2026-02-16 17:39:26.411 186180 DEBUG oslo_concurrency.processutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:39:27 compute-0 nova_compute[186176]: 2026-02-16 17:39:27.345 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:28 compute-0 sshd-session[211972]: Accepted publickey for nova from 192.168.122.101 port 44014 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:39:28 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 17:39:28 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 17:39:28 compute-0 systemd-logind[821]: New session 37 of user nova.
Feb 16 17:39:28 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 17:39:28 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 17:39:28 compute-0 systemd[211976]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:39:28 compute-0 systemd[211976]: Queued start job for default target Main User Target.
Feb 16 17:39:28 compute-0 systemd[211976]: Created slice User Application Slice.
Feb 16 17:39:28 compute-0 systemd[211976]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:39:28 compute-0 systemd[211976]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 17:39:28 compute-0 systemd[211976]: Reached target Paths.
Feb 16 17:39:28 compute-0 systemd[211976]: Reached target Timers.
Feb 16 17:39:28 compute-0 systemd[211976]: Starting D-Bus User Message Bus Socket...
Feb 16 17:39:28 compute-0 systemd[211976]: Starting Create User's Volatile Files and Directories...
Feb 16 17:39:28 compute-0 systemd[211976]: Finished Create User's Volatile Files and Directories.
Feb 16 17:39:28 compute-0 systemd[211976]: Listening on D-Bus User Message Bus Socket.
Feb 16 17:39:28 compute-0 systemd[211976]: Reached target Sockets.
Feb 16 17:39:28 compute-0 systemd[211976]: Reached target Basic System.
Feb 16 17:39:28 compute-0 systemd[211976]: Reached target Main User Target.
Feb 16 17:39:28 compute-0 systemd[211976]: Startup finished in 133ms.
Feb 16 17:39:28 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 17:39:28 compute-0 systemd[1]: Started Session 37 of User nova.
Feb 16 17:39:28 compute-0 sshd-session[211972]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:39:28 compute-0 sshd-session[212006]: Received disconnect from 192.168.122.101 port 44014:11: disconnected by user
Feb 16 17:39:28 compute-0 sshd-session[212006]: Disconnected from user nova 192.168.122.101 port 44014
Feb 16 17:39:28 compute-0 sshd-session[211972]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:39:28 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Feb 16 17:39:28 compute-0 systemd-logind[821]: Session 37 logged out. Waiting for processes to exit.
Feb 16 17:39:28 compute-0 podman[211992]: 2026-02-16 17:39:28.326865029 +0000 UTC m=+0.089632198 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:39:28 compute-0 systemd-logind[821]: Removed session 37.
Feb 16 17:39:28 compute-0 podman[211991]: 2026-02-16 17:39:28.33273322 +0000 UTC m=+0.096661947 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:39:28 compute-0 nova_compute[186176]: 2026-02-16 17:39:28.689 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:29.339 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:39:29 compute-0 nova_compute[186176]: 2026-02-16 17:39:29.339 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:29 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:29.341 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:39:29 compute-0 nova_compute[186176]: 2026-02-16 17:39:29.353 186180 DEBUG nova.compute.manager [req-bd4fff7a-6449-4466-a6d5-390e604898f9 req-a7cdb2a7-39a3-45a7-a0ed-7bdcbe90c7b8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-unplugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:39:29 compute-0 nova_compute[186176]: 2026-02-16 17:39:29.354 186180 DEBUG oslo_concurrency.lockutils [req-bd4fff7a-6449-4466-a6d5-390e604898f9 req-a7cdb2a7-39a3-45a7-a0ed-7bdcbe90c7b8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:29 compute-0 nova_compute[186176]: 2026-02-16 17:39:29.355 186180 DEBUG oslo_concurrency.lockutils [req-bd4fff7a-6449-4466-a6d5-390e604898f9 req-a7cdb2a7-39a3-45a7-a0ed-7bdcbe90c7b8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:29 compute-0 nova_compute[186176]: 2026-02-16 17:39:29.355 186180 DEBUG oslo_concurrency.lockutils [req-bd4fff7a-6449-4466-a6d5-390e604898f9 req-a7cdb2a7-39a3-45a7-a0ed-7bdcbe90c7b8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:29 compute-0 nova_compute[186176]: 2026-02-16 17:39:29.355 186180 DEBUG nova.compute.manager [req-bd4fff7a-6449-4466-a6d5-390e604898f9 req-a7cdb2a7-39a3-45a7-a0ed-7bdcbe90c7b8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] No waiting events found dispatching network-vif-unplugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:39:29 compute-0 nova_compute[186176]: 2026-02-16 17:39:29.356 186180 DEBUG nova.compute.manager [req-bd4fff7a-6449-4466-a6d5-390e604898f9 req-a7cdb2a7-39a3-45a7-a0ed-7bdcbe90c7b8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-unplugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:39:29 compute-0 podman[195505]: time="2026-02-16T17:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:39:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:39:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Feb 16 17:39:30 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:30.343 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.070 186180 INFO nova.compute.manager [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Took 4.66 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.071 186180 DEBUG nova.compute.manager [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.149 186180 DEBUG nova.compute.manager [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfdhu_2we',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e039ddb6-babe-4626-a22d-fe3afde47f55',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(ba040bbe-1df3-482d-9f10-650dad157f17),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.192 186180 DEBUG nova.objects.instance [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid e039ddb6-babe-4626-a22d-fe3afde47f55 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.194 186180 DEBUG nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.196 186180 DEBUG nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.196 186180 DEBUG nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.301 186180 DEBUG nova.virt.libvirt.vif [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:38:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1452410303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1452410303',id=15,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:38:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-t0k52ig0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:38:24Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=e039ddb6-babe-4626-a22d-fe3afde47f55,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.301 186180 DEBUG nova.network.os_vif_util [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.303 186180 DEBUG nova.network.os_vif_util [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d3:89:52,bridge_name='br-int',has_traffic_filtering=True,id=f4d3c9aa-91fe-4379-8354-ae62aafad774,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4d3c9aa-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.304 186180 DEBUG nova.virt.libvirt.migration [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 17:39:31 compute-0 nova_compute[186176]:   <mac address="fa:16:3e:d3:89:52"/>
Feb 16 17:39:31 compute-0 nova_compute[186176]:   <model type="virtio"/>
Feb 16 17:39:31 compute-0 nova_compute[186176]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:39:31 compute-0 nova_compute[186176]:   <mtu size="1442"/>
Feb 16 17:39:31 compute-0 nova_compute[186176]:   <target dev="tapf4d3c9aa-91"/>
Feb 16 17:39:31 compute-0 nova_compute[186176]: </interface>
Feb 16 17:39:31 compute-0 nova_compute[186176]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.305 186180 DEBUG nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 17:39:31 compute-0 openstack_network_exporter[198360]: ERROR   17:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:39:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:39:31 compute-0 openstack_network_exporter[198360]: ERROR   17:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:39:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.459 186180 DEBUG nova.compute.manager [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.460 186180 DEBUG oslo_concurrency.lockutils [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.460 186180 DEBUG oslo_concurrency.lockutils [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.461 186180 DEBUG oslo_concurrency.lockutils [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.461 186180 DEBUG nova.compute.manager [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] No waiting events found dispatching network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.461 186180 WARNING nova.compute.manager [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received unexpected event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 for instance with vm_state active and task_state migrating.
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.462 186180 DEBUG nova.compute.manager [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-changed-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.462 186180 DEBUG nova.compute.manager [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Refreshing instance network info cache due to event network-changed-f4d3c9aa-91fe-4379-8354-ae62aafad774. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.463 186180 DEBUG oslo_concurrency.lockutils [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.463 186180 DEBUG oslo_concurrency.lockutils [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.463 186180 DEBUG nova.network.neutron [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Refreshing network info cache for port f4d3c9aa-91fe-4379-8354-ae62aafad774 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.698 186180 DEBUG nova.virt.libvirt.migration [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.699 186180 INFO nova.virt.libvirt.migration [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 17:39:31 compute-0 nova_compute[186176]: 2026-02-16 17:39:31.819 186180 INFO nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.323 186180 DEBUG nova.virt.libvirt.migration [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.323 186180 DEBUG nova.virt.libvirt.migration [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.348 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.828 186180 DEBUG nova.virt.libvirt.migration [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.829 186180 DEBUG nova.virt.libvirt.migration [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.949 186180 DEBUG nova.network.neutron [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Updated VIF entry in instance network info cache for port f4d3c9aa-91fe-4379-8354-ae62aafad774. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.950 186180 DEBUG nova.network.neutron [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Updating instance_info_cache with network_info: [{"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.954 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263572.9530284, e039ddb6-babe-4626-a22d-fe3afde47f55 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.954 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] VM Paused (Lifecycle Event)
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.978 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.979 186180 DEBUG oslo_concurrency.lockutils [req-923cd7a1-6c5c-458c-9f3f-8de11cf807fa req-04633d15-9073-4dd0-a6c8-bbde5e59e468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-e039ddb6-babe-4626-a22d-fe3afde47f55" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:39:32 compute-0 nova_compute[186176]: 2026-02-16 17:39:32.985 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.014 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 17:39:33 compute-0 kernel: tapf4d3c9aa-91 (unregistering): left promiscuous mode
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.074 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:33 compute-0 NetworkManager[56463]: <info>  [1771263573.0761] device (tapf4d3c9aa-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:39:33 compute-0 ovn_controller[96437]: 2026-02-16T17:39:33Z|00132|binding|INFO|Releasing lport f4d3c9aa-91fe-4379-8354-ae62aafad774 from this chassis (sb_readonly=0)
Feb 16 17:39:33 compute-0 ovn_controller[96437]: 2026-02-16T17:39:33Z|00133|binding|INFO|Setting lport f4d3c9aa-91fe-4379-8354-ae62aafad774 down in Southbound
Feb 16 17:39:33 compute-0 ovn_controller[96437]: 2026-02-16T17:39:33Z|00134|binding|INFO|Removing iface tapf4d3c9aa-91 ovn-installed in OVS
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.082 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.090 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:89:52 10.100.0.12'], port_security=['fa:16:3e:d3:89:52 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '2e3a84a9-c1b4-4b1e-92e3-57d0875592cc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'e039ddb6-babe-4626-a22d-fe3afde47f55', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=f4d3c9aa-91fe-4379-8354-ae62aafad774) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.091 105730 INFO neutron.agent.ovn.metadata.agent [-] Port f4d3c9aa-91fe-4379-8354-ae62aafad774 in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 unbound from our chassis
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.093 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.093 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.095 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[346278d7-77dc-44f6-9346-4d4a4625f756]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.098 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace which is not needed anymore
Feb 16 17:39:33 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Feb 16 17:39:33 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000f.scope: Consumed 14.455s CPU time.
Feb 16 17:39:33 compute-0 systemd-machined[155631]: Machine qemu-12-instance-0000000f terminated.
Feb 16 17:39:33 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211732]: [NOTICE]   (211741) : haproxy version is 2.8.14-c23fe91
Feb 16 17:39:33 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211732]: [NOTICE]   (211741) : path to executable is /usr/sbin/haproxy
Feb 16 17:39:33 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211732]: [WARNING]  (211741) : Exiting Master process...
Feb 16 17:39:33 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211732]: [ALERT]    (211741) : Current worker (211743) exited with code 143 (Terminated)
Feb 16 17:39:33 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[211732]: [WARNING]  (211741) : All workers exited. Exiting... (0)
Feb 16 17:39:33 compute-0 systemd[1]: libpod-d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862.scope: Deactivated successfully.
Feb 16 17:39:33 compute-0 podman[212086]: 2026-02-16 17:39:33.238266268 +0000 UTC m=+0.047487274 container died d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:39:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862-userdata-shm.mount: Deactivated successfully.
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.275 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-913f3fda316971555d6dff63a7f3f35d38baa4e4efef44ed32d245ee62ed07d3-merged.mount: Deactivated successfully.
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.281 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:33 compute-0 podman[212086]: 2026-02-16 17:39:33.290679649 +0000 UTC m=+0.099900685 container cleanup d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 16 17:39:33 compute-0 systemd[1]: libpod-conmon-d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862.scope: Deactivated successfully.
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.325 186180 DEBUG nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.326 186180 DEBUG nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.326 186180 DEBUG nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.333 186180 DEBUG nova.virt.libvirt.guest [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'e039ddb6-babe-4626-a22d-fe3afde47f55' (instance-0000000f) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.333 186180 INFO nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Migration operation has completed
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.333 186180 INFO nova.compute.manager [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] _post_live_migration() is started..
Feb 16 17:39:33 compute-0 podman[212130]: 2026-02-16 17:39:33.36676855 +0000 UTC m=+0.050565528 container remove d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.371 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f91e96a8-ea53-4bf6-9b71-4312673ce479]: (4, ('Mon Feb 16 05:39:33 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862)\nd3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862\nMon Feb 16 05:39:33 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (d3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862)\nd3b1b2a7a9e3691760761b831ba4cdbb86869c1e3bccbceb0404af73e1505862\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.373 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[782225db-5be9-4799-9c99-4fdc41535e01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.375 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.378 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:33 compute-0 kernel: tap94cafcd0-c0: left promiscuous mode
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.390 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.394 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[362549da-960e-4b64-9ef8-b6c3f11679bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.411 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[98dac0b5-66f1-4fed-ac11-48cabb95d91d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.413 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[fad88d9f-f5b7-48fa-8eaf-06273fe9a14a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.429 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[78f5d58b-7da5-4d1a-859a-676682f79e89]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 509892, 'reachable_time': 37777, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212151, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.432 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:39:33 compute-0 systemd[1]: run-netns-ovnmeta\x2d94cafcd0\x2dc7c2\x2d48b4\x2da2dd\x2d21c16ce48dc4.mount: Deactivated successfully.
Feb 16 17:39:33 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:33.432 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[ca634d01-9f59-4d90-9710-8531ad1b3fea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.537 186180 DEBUG nova.compute.manager [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-unplugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.537 186180 DEBUG oslo_concurrency.lockutils [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.538 186180 DEBUG oslo_concurrency.lockutils [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.538 186180 DEBUG oslo_concurrency.lockutils [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.538 186180 DEBUG nova.compute.manager [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] No waiting events found dispatching network-vif-unplugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.538 186180 DEBUG nova.compute.manager [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-unplugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.538 186180 DEBUG nova.compute.manager [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.538 186180 DEBUG oslo_concurrency.lockutils [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.539 186180 DEBUG oslo_concurrency.lockutils [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.539 186180 DEBUG oslo_concurrency.lockutils [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.539 186180 DEBUG nova.compute.manager [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] No waiting events found dispatching network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.539 186180 WARNING nova.compute.manager [req-3a2f52dc-7d32-4743-a175-c0d790c8a920 req-ce075b64-5330-4d3b-b29a-2b6cb8294784 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received unexpected event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 for instance with vm_state active and task_state migrating.
Feb 16 17:39:33 compute-0 nova_compute[186176]: 2026-02-16 17:39:33.692 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.188 186180 DEBUG nova.network.neutron [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Activated binding for port f4d3c9aa-91fe-4379-8354-ae62aafad774 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.189 186180 DEBUG nova.compute.manager [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.190 186180 DEBUG nova.virt.libvirt.vif [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:38:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1452410303',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1452410303',id=15,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:38:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-t0k52ig0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:39:22Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=e039ddb6-babe-4626-a22d-fe3afde47f55,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.191 186180 DEBUG nova.network.os_vif_util [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "address": "fa:16:3e:d3:89:52", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf4d3c9aa-91", "ovs_interfaceid": "f4d3c9aa-91fe-4379-8354-ae62aafad774", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.192 186180 DEBUG nova.network.os_vif_util [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d3:89:52,bridge_name='br-int',has_traffic_filtering=True,id=f4d3c9aa-91fe-4379-8354-ae62aafad774,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4d3c9aa-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.193 186180 DEBUG os_vif [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:89:52,bridge_name='br-int',has_traffic_filtering=True,id=f4d3c9aa-91fe-4379-8354-ae62aafad774,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4d3c9aa-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.195 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.195 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4d3c9aa-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.198 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.200 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.203 186180 INFO os_vif [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:89:52,bridge_name='br-int',has_traffic_filtering=True,id=f4d3c9aa-91fe-4379-8354-ae62aafad774,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf4d3c9aa-91')
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.203 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.204 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.204 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.204 186180 DEBUG nova.compute.manager [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.204 186180 INFO nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Deleting instance files /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55_del
Feb 16 17:39:34 compute-0 nova_compute[186176]: 2026-02-16 17:39:34.205 186180 INFO nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Deletion of /var/lib/nova/instances/e039ddb6-babe-4626-a22d-fe3afde47f55_del complete
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.630 186180 DEBUG nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.630 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.631 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.631 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.631 186180 DEBUG nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] No waiting events found dispatching network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.631 186180 WARNING nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received unexpected event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 for instance with vm_state active and task_state migrating.
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.632 186180 DEBUG nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-unplugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.632 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.632 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.632 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.633 186180 DEBUG nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] No waiting events found dispatching network-vif-unplugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.633 186180 DEBUG nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-unplugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.633 186180 DEBUG nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.633 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.633 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.634 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.634 186180 DEBUG nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] No waiting events found dispatching network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.634 186180 WARNING nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received unexpected event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 for instance with vm_state active and task_state migrating.
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.634 186180 DEBUG nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.635 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.635 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.635 186180 DEBUG oslo_concurrency.lockutils [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.635 186180 DEBUG nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] No waiting events found dispatching network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:39:35 compute-0 nova_compute[186176]: 2026-02-16 17:39:35.635 186180 WARNING nova.compute.manager [req-409b2520-830a-4c57-bfcd-ab56915398df req-0f994a16-fdc2-4df6-8010-a450c0451999 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Received unexpected event network-vif-plugged-f4d3c9aa-91fe-4379-8354-ae62aafad774 for instance with vm_state active and task_state migrating.
Feb 16 17:39:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:38.172 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:38.173 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:39:38.173 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:38 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 17:39:38 compute-0 systemd[211976]: Activating special unit Exit the Session...
Feb 16 17:39:38 compute-0 systemd[211976]: Stopped target Main User Target.
Feb 16 17:39:38 compute-0 systemd[211976]: Stopped target Basic System.
Feb 16 17:39:38 compute-0 systemd[211976]: Stopped target Paths.
Feb 16 17:39:38 compute-0 systemd[211976]: Stopped target Sockets.
Feb 16 17:39:38 compute-0 systemd[211976]: Stopped target Timers.
Feb 16 17:39:38 compute-0 systemd[211976]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:39:38 compute-0 systemd[211976]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 17:39:38 compute-0 systemd[211976]: Closed D-Bus User Message Bus Socket.
Feb 16 17:39:38 compute-0 systemd[211976]: Stopped Create User's Volatile Files and Directories.
Feb 16 17:39:38 compute-0 systemd[211976]: Removed slice User Application Slice.
Feb 16 17:39:38 compute-0 systemd[211976]: Reached target Shutdown.
Feb 16 17:39:38 compute-0 systemd[211976]: Finished Exit the Session.
Feb 16 17:39:38 compute-0 systemd[211976]: Reached target Exit the Session.
Feb 16 17:39:38 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 17:39:38 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 17:39:38 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 17:39:38 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 17:39:38 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 17:39:38 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 17:39:38 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 17:39:38 compute-0 nova_compute[186176]: 2026-02-16 17:39:38.694 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.043 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.044 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.045 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "e039ddb6-babe-4626-a22d-fe3afde47f55-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.074 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.075 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.076 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.076 186180 DEBUG nova.compute.resource_tracker [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.198 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.274 186180 WARNING nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.276 186180 DEBUG nova.compute.resource_tracker [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5810MB free_disk=73.22378921508789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.276 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.276 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.319 186180 DEBUG nova.compute.resource_tracker [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Migration for instance e039ddb6-babe-4626-a22d-fe3afde47f55 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.349 186180 DEBUG nova.compute.resource_tracker [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.534 186180 DEBUG nova.compute.resource_tracker [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Migration ba040bbe-1df3-482d-9f10-650dad157f17 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.535 186180 DEBUG nova.compute.resource_tracker [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.535 186180 DEBUG nova.compute.resource_tracker [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.580 186180 DEBUG nova.compute.provider_tree [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.604 186180 DEBUG nova.scheduler.client.report [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.628 186180 DEBUG nova.compute.resource_tracker [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.629 186180 DEBUG oslo_concurrency.lockutils [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.353s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.637 186180 INFO nova.compute.manager [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.715 186180 INFO nova.scheduler.client.report [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Deleted allocation for migration ba040bbe-1df3-482d-9f10-650dad157f17
Feb 16 17:39:39 compute-0 nova_compute[186176]: 2026-02-16 17:39:39.716 186180 DEBUG nova.virt.libvirt.driver [None req-1f6cb96f-e803-4be1-b9d6-06e3d9dad0f2 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 17:39:43 compute-0 nova_compute[186176]: 2026-02-16 17:39:43.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:39:43 compute-0 nova_compute[186176]: 2026-02-16 17:39:43.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:39:43 compute-0 nova_compute[186176]: 2026-02-16 17:39:43.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:39:43 compute-0 nova_compute[186176]: 2026-02-16 17:39:43.332 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:39:43 compute-0 nova_compute[186176]: 2026-02-16 17:39:43.530 186180 DEBUG nova.compute.manager [None req-4a1d8468-9be7-4e19-89b2-ac90e2bc65bb b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider bb904aac-529f-46ef-9861-9c655a4b383c in placement. update_compute_provider_status /usr/lib/python3.9/site-packages/nova/compute/manager.py:606
Feb 16 17:39:43 compute-0 nova_compute[186176]: 2026-02-16 17:39:43.582 186180 DEBUG nova.compute.provider_tree [None req-4a1d8468-9be7-4e19-89b2-ac90e2bc65bb b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Updating resource provider bb904aac-529f-46ef-9861-9c655a4b383c generation from 26 to 29 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 16 17:39:43 compute-0 nova_compute[186176]: 2026-02-16 17:39:43.696 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:44 compute-0 nova_compute[186176]: 2026-02-16 17:39:44.200 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:44 compute-0 nova_compute[186176]: 2026-02-16 17:39:44.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.340 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.340 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.340 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.341 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.509 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.511 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5808MB free_disk=73.22380828857422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.512 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.512 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.565 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.566 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.580 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing inventories for resource provider bb904aac-529f-46ef-9861-9c655a4b383c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.597 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating ProviderTree inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.598 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.613 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing aggregate associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.642 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing trait associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.665 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.687 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.689 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:39:46 compute-0 nova_compute[186176]: 2026-02-16 17:39:46.689 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:39:48 compute-0 nova_compute[186176]: 2026-02-16 17:39:48.320 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771263573.320101, e039ddb6-babe-4626-a22d-fe3afde47f55 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:39:48 compute-0 nova_compute[186176]: 2026-02-16 17:39:48.321 186180 INFO nova.compute.manager [-] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] VM Stopped (Lifecycle Event)
Feb 16 17:39:48 compute-0 nova_compute[186176]: 2026-02-16 17:39:48.343 186180 DEBUG nova.compute.manager [None req-432c98bb-a422-4afa-85f9-0c6792df049f - - - - - -] [instance: e039ddb6-babe-4626-a22d-fe3afde47f55] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:39:48 compute-0 nova_compute[186176]: 2026-02-16 17:39:48.699 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:49 compute-0 nova_compute[186176]: 2026-02-16 17:39:49.234 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:49 compute-0 nova_compute[186176]: 2026-02-16 17:39:49.689 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:39:49 compute-0 nova_compute[186176]: 2026-02-16 17:39:49.690 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:39:49 compute-0 nova_compute[186176]: 2026-02-16 17:39:49.690 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:39:50 compute-0 podman[212156]: 2026-02-16 17:39:50.114441502 +0000 UTC m=+0.076404688 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 17:39:50 compute-0 nova_compute[186176]: 2026-02-16 17:39:50.313 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:39:50 compute-0 nova_compute[186176]: 2026-02-16 17:39:50.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:39:50 compute-0 nova_compute[186176]: 2026-02-16 17:39:50.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:39:52 compute-0 podman[212177]: 2026-02-16 17:39:52.101475212 +0000 UTC m=+0.067716171 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 16 17:39:53 compute-0 nova_compute[186176]: 2026-02-16 17:39:53.702 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:54 compute-0 nova_compute[186176]: 2026-02-16 17:39:54.236 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:56 compute-0 nova_compute[186176]: 2026-02-16 17:39:56.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:39:58 compute-0 nova_compute[186176]: 2026-02-16 17:39:58.705 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:59 compute-0 podman[212200]: 2026-02-16 17:39:59.118954852 +0000 UTC m=+0.070862511 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 17:39:59 compute-0 podman[212199]: 2026-02-16 17:39:59.128242576 +0000 UTC m=+0.082545863 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 16 17:39:59 compute-0 nova_compute[186176]: 2026-02-16 17:39:59.239 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:39:59 compute-0 podman[195505]: time="2026-02-16T17:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:39:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:39:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 17:40:01 compute-0 openstack_network_exporter[198360]: ERROR   17:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:40:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:40:01 compute-0 openstack_network_exporter[198360]: ERROR   17:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:40:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:40:03 compute-0 nova_compute[186176]: 2026-02-16 17:40:03.708 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:04 compute-0 nova_compute[186176]: 2026-02-16 17:40:04.283 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:08 compute-0 nova_compute[186176]: 2026-02-16 17:40:08.709 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:09 compute-0 nova_compute[186176]: 2026-02-16 17:40:09.314 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.190 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.191 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.207 186180 DEBUG nova.compute.manager [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.278 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.278 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.286 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.286 186180 INFO nova.compute.claims [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.384 186180 DEBUG nova.compute.provider_tree [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.396 186180 DEBUG nova.scheduler.client.report [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.411 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.411 186180 DEBUG nova.compute.manager [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.457 186180 DEBUG nova.compute.manager [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.457 186180 DEBUG nova.network.neutron [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.476 186180 INFO nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.496 186180 DEBUG nova.compute.manager [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.590 186180 DEBUG nova.compute.manager [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.592 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.592 186180 INFO nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Creating image(s)
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.593 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "/var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.593 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.594 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.611 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.686 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.687 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.688 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.702 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.751 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.767 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.768 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.794 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.795 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.796 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.861 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.862 186180 DEBUG nova.virt.disk.api [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Checking if we can resize image /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.863 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.916 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.918 186180 DEBUG nova.virt.disk.api [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Cannot resize image /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.919 186180 DEBUG nova.objects.instance [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'migration_context' on Instance uuid 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.936 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.937 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Ensure instance console log exists: /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.937 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.938 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:40:13 compute-0 nova_compute[186176]: 2026-02-16 17:40:13.939 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:40:14 compute-0 nova_compute[186176]: 2026-02-16 17:40:14.000 186180 DEBUG nova.policy [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54934f49b2044289bcf127662fe114b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:40:14 compute-0 nova_compute[186176]: 2026-02-16 17:40:14.317 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:14 compute-0 nova_compute[186176]: 2026-02-16 17:40:14.564 186180 DEBUG nova.network.neutron [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Successfully created port: cfacb3ed-a217-4bea-ac84-0fb16e1fa1af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:40:15 compute-0 sshd-session[212154]: Connection reset by authenticating user root 20.160.107.24 port 40180 [preauth]
Feb 16 17:40:15 compute-0 nova_compute[186176]: 2026-02-16 17:40:15.470 186180 DEBUG nova.network.neutron [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Successfully updated port: cfacb3ed-a217-4bea-ac84-0fb16e1fa1af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:40:15 compute-0 nova_compute[186176]: 2026-02-16 17:40:15.498 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "refresh_cache-6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:40:15 compute-0 nova_compute[186176]: 2026-02-16 17:40:15.499 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquired lock "refresh_cache-6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:40:15 compute-0 nova_compute[186176]: 2026-02-16 17:40:15.499 186180 DEBUG nova.network.neutron [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:40:15 compute-0 nova_compute[186176]: 2026-02-16 17:40:15.570 186180 DEBUG nova.compute.manager [req-6b66a353-fd57-4aa7-bdd1-b9ddce8ebe22 req-1333e8bc-90bf-4d18-bcc9-7bf56e04e1e8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Received event network-changed-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:40:15 compute-0 nova_compute[186176]: 2026-02-16 17:40:15.571 186180 DEBUG nova.compute.manager [req-6b66a353-fd57-4aa7-bdd1-b9ddce8ebe22 req-1333e8bc-90bf-4d18-bcc9-7bf56e04e1e8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Refreshing instance network info cache due to event network-changed-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:40:15 compute-0 nova_compute[186176]: 2026-02-16 17:40:15.571 186180 DEBUG oslo_concurrency.lockutils [req-6b66a353-fd57-4aa7-bdd1-b9ddce8ebe22 req-1333e8bc-90bf-4d18-bcc9-7bf56e04e1e8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:40:15 compute-0 nova_compute[186176]: 2026-02-16 17:40:15.672 186180 DEBUG nova.network.neutron [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.592 186180 DEBUG nova.network.neutron [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Updating instance_info_cache with network_info: [{"id": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "address": "fa:16:3e:00:90:49", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfacb3ed-a2", "ovs_interfaceid": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.609 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Releasing lock "refresh_cache-6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.610 186180 DEBUG nova.compute.manager [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Instance network_info: |[{"id": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "address": "fa:16:3e:00:90:49", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfacb3ed-a2", "ovs_interfaceid": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.610 186180 DEBUG oslo_concurrency.lockutils [req-6b66a353-fd57-4aa7-bdd1-b9ddce8ebe22 req-1333e8bc-90bf-4d18-bcc9-7bf56e04e1e8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.610 186180 DEBUG nova.network.neutron [req-6b66a353-fd57-4aa7-bdd1-b9ddce8ebe22 req-1333e8bc-90bf-4d18-bcc9-7bf56e04e1e8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Refreshing network info cache for port cfacb3ed-a217-4bea-ac84-0fb16e1fa1af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.614 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Start _get_guest_xml network_info=[{"id": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "address": "fa:16:3e:00:90:49", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfacb3ed-a2", "ovs_interfaceid": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.618 186180 WARNING nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.628 186180 DEBUG nova.virt.libvirt.host [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.629 186180 DEBUG nova.virt.libvirt.host [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.632 186180 DEBUG nova.virt.libvirt.host [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.633 186180 DEBUG nova.virt.libvirt.host [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.634 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.634 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.635 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.635 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.636 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.636 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.636 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.636 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.637 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.637 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.637 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.638 186180 DEBUG nova.virt.hardware [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.642 186180 DEBUG nova.virt.libvirt.vif [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:40:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2143366756',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2143366756',id=17,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-12tqpd9i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=TagList,task
_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:40:13Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=6fc932c5-7baf-4d8d-a2fa-6b79e4937c72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "address": "fa:16:3e:00:90:49", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfacb3ed-a2", "ovs_interfaceid": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.643 186180 DEBUG nova.network.os_vif_util [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "address": "fa:16:3e:00:90:49", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfacb3ed-a2", "ovs_interfaceid": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.644 186180 DEBUG nova.network.os_vif_util [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:90:49,bridge_name='br-int',has_traffic_filtering=True,id=cfacb3ed-a217-4bea-ac84-0fb16e1fa1af,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfacb3ed-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.645 186180 DEBUG nova.objects.instance [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'pci_devices' on Instance uuid 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.660 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:40:16 compute-0 nova_compute[186176]:   <uuid>6fc932c5-7baf-4d8d-a2fa-6b79e4937c72</uuid>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   <name>instance-00000011</name>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteStrategies-server-2143366756</nova:name>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:40:16</nova:creationTime>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:40:16 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:40:16 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:40:16 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:40:16 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:40:16 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:40:16 compute-0 nova_compute[186176]:         <nova:user uuid="c54934f49b2044289bcf127662fe114b">tempest-TestExecuteStrategies-1098930400-project-member</nova:user>
Feb 16 17:40:16 compute-0 nova_compute[186176]:         <nova:project uuid="1a237c4b00c5426cb1dc6afe3c7c868c">tempest-TestExecuteStrategies-1098930400</nova:project>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:40:16 compute-0 nova_compute[186176]:         <nova:port uuid="cfacb3ed-a217-4bea-ac84-0fb16e1fa1af">
Feb 16 17:40:16 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <system>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <entry name="serial">6fc932c5-7baf-4d8d-a2fa-6b79e4937c72</entry>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <entry name="uuid">6fc932c5-7baf-4d8d-a2fa-6b79e4937c72</entry>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     </system>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   <os>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   </os>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   <features>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   </features>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk.config"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:00:90:49"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <target dev="tapcfacb3ed-a2"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/console.log" append="off"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <video>
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     </video>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:40:16 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:40:16 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:40:16 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:40:16 compute-0 nova_compute[186176]: </domain>
Feb 16 17:40:16 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.661 186180 DEBUG nova.compute.manager [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Preparing to wait for external event network-vif-plugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.661 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.661 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.662 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.663 186180 DEBUG nova.virt.libvirt.vif [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:40:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2143366756',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2143366756',id=17,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-12tqpd9i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:40:13Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=6fc932c5-7baf-4d8d-a2fa-6b79e4937c72,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "address": "fa:16:3e:00:90:49", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfacb3ed-a2", "ovs_interfaceid": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.663 186180 DEBUG nova.network.os_vif_util [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "address": "fa:16:3e:00:90:49", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfacb3ed-a2", "ovs_interfaceid": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.664 186180 DEBUG nova.network.os_vif_util [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:90:49,bridge_name='br-int',has_traffic_filtering=True,id=cfacb3ed-a217-4bea-ac84-0fb16e1fa1af,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfacb3ed-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.664 186180 DEBUG os_vif [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:90:49,bridge_name='br-int',has_traffic_filtering=True,id=cfacb3ed-a217-4bea-ac84-0fb16e1fa1af,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfacb3ed-a2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.665 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.665 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.666 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.671 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.671 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcfacb3ed-a2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.672 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcfacb3ed-a2, col_values=(('external_ids', {'iface-id': 'cfacb3ed-a217-4bea-ac84-0fb16e1fa1af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:90:49', 'vm-uuid': '6fc932c5-7baf-4d8d-a2fa-6b79e4937c72'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.674 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.676 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:40:16 compute-0 NetworkManager[56463]: <info>  [1771263616.6769] manager: (tapcfacb3ed-a2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.682 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.683 186180 INFO os_vif [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:90:49,bridge_name='br-int',has_traffic_filtering=True,id=cfacb3ed-a217-4bea-ac84-0fb16e1fa1af,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfacb3ed-a2')
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.737 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.737 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.737 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No VIF found with MAC fa:16:3e:00:90:49, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:40:16 compute-0 nova_compute[186176]: 2026-02-16 17:40:16.738 186180 INFO nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Using config drive
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.075 186180 INFO nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Creating config drive at /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk.config
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.083 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpv6tubb2o execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.211 186180 DEBUG oslo_concurrency.processutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpv6tubb2o" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:40:17 compute-0 kernel: tapcfacb3ed-a2: entered promiscuous mode
Feb 16 17:40:17 compute-0 NetworkManager[56463]: <info>  [1771263617.2679] manager: (tapcfacb3ed-a2): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.269 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:17 compute-0 ovn_controller[96437]: 2026-02-16T17:40:17Z|00135|binding|INFO|Claiming lport cfacb3ed-a217-4bea-ac84-0fb16e1fa1af for this chassis.
Feb 16 17:40:17 compute-0 ovn_controller[96437]: 2026-02-16T17:40:17Z|00136|binding|INFO|cfacb3ed-a217-4bea-ac84-0fb16e1fa1af: Claiming fa:16:3e:00:90:49 10.100.0.3
Feb 16 17:40:17 compute-0 ovn_controller[96437]: 2026-02-16T17:40:17Z|00137|binding|INFO|Setting lport cfacb3ed-a217-4bea-ac84-0fb16e1fa1af ovn-installed in OVS
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.275 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.279 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:17 compute-0 ovn_controller[96437]: 2026-02-16T17:40:17Z|00138|binding|INFO|Setting lport cfacb3ed-a217-4bea-ac84-0fb16e1fa1af up in Southbound
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.281 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:90:49 10.100.0.3'], port_security=['fa:16:3e:00:90:49 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6fc932c5-7baf-4d8d-a2fa-6b79e4937c72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=cfacb3ed-a217-4bea-ac84-0fb16e1fa1af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.282 105730 INFO neutron.agent.ovn.metadata.agent [-] Port cfacb3ed-a217-4bea-ac84-0fb16e1fa1af in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 bound to our chassis
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.283 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.293 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[8d03e1d9-6b4e-4778-841d-95d90abbcca4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.294 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap94cafcd0-c1 in ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.296 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap94cafcd0-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.296 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[907b545b-f8aa-49af-8006-b8ed712a9287]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.297 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[cb97f1a6-30b1-489d-a6f3-59b2c55575ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 systemd-machined[155631]: New machine qemu-13-instance-00000011.
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.307 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[3b919b69-cffc-4784-b097-514b0e4fc94d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000011.
Feb 16 17:40:17 compute-0 systemd-udevd[212288]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.330 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[62463781-1c15-43c7-8a8c-c0acbb48d676]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 NetworkManager[56463]: <info>  [1771263617.3339] device (tapcfacb3ed-a2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:40:17 compute-0 NetworkManager[56463]: <info>  [1771263617.3348] device (tapcfacb3ed-a2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.363 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[68f32c94-cb23-419d-abc7-44f2560451f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 NetworkManager[56463]: <info>  [1771263617.3721] manager: (tap94cafcd0-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/62)
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.372 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[2619891b-f690-4621-bba1-6e229dc7ef37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.401 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[7602612a-27a9-4895-9bf6-adecd7c037ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.404 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[87f850dc-ff2f-44eb-a611-64a3bd4b2adf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 NetworkManager[56463]: <info>  [1771263617.4189] device (tap94cafcd0-c0): carrier: link connected
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.421 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[83ca829c-b8f2-4696-bf89-052435c24725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.432 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[79ce45f1-b64a-4269-bcf1-9ca1cb71c309]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521294, 'reachable_time': 24460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212318, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.439 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[1230c8ff-d71a-49c4-8ae9-4c07f747c3cc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 521294, 'tstamp': 521294}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212319, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.450 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f45b6805-ec8f-40b3-bb1c-8b9d20c74a52]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521294, 'reachable_time': 24460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212320, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.468 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[cd44cc61-748f-4943-9b96-f2a795b37b49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.496 186180 DEBUG nova.compute.manager [req-79eabdf1-b440-4a16-951a-6c6662ab3349 req-6a92f80a-bb6b-40b6-961b-05819d52a5b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Received event network-vif-plugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.496 186180 DEBUG oslo_concurrency.lockutils [req-79eabdf1-b440-4a16-951a-6c6662ab3349 req-6a92f80a-bb6b-40b6-961b-05819d52a5b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.496 186180 DEBUG oslo_concurrency.lockutils [req-79eabdf1-b440-4a16-951a-6c6662ab3349 req-6a92f80a-bb6b-40b6-961b-05819d52a5b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.496 186180 DEBUG oslo_concurrency.lockutils [req-79eabdf1-b440-4a16-951a-6c6662ab3349 req-6a92f80a-bb6b-40b6-961b-05819d52a5b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.497 186180 DEBUG nova.compute.manager [req-79eabdf1-b440-4a16-951a-6c6662ab3349 req-6a92f80a-bb6b-40b6-961b-05819d52a5b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Processing event network-vif-plugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.512 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[23a3b2f2-b1be-439c-a5ea-18d2c5238e74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.513 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.513 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.514 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94cafcd0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.515 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:17 compute-0 NetworkManager[56463]: <info>  [1771263617.5165] manager: (tap94cafcd0-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Feb 16 17:40:17 compute-0 kernel: tap94cafcd0-c0: entered promiscuous mode
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.517 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.518 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94cafcd0-c0, col_values=(('external_ids', {'iface-id': '5c28d585-b48c-40c6-b5e7-f1e59317b2de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.519 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.520 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:17 compute-0 ovn_controller[96437]: 2026-02-16T17:40:17Z|00139|binding|INFO|Releasing lport 5c28d585-b48c-40c6-b5e7-f1e59317b2de from this chassis (sb_readonly=0)
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.521 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.524 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.525 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3bac0bff-f104-469f-8526-7c84fa2e5134]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.525 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:40:17 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:17.527 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'env', 'PROCESS_TAG=haproxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.662 186180 DEBUG nova.compute.manager [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.663 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263617.66251, 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.664 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] VM Started (Lifecycle Event)
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.671 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.674 186180 INFO nova.virt.libvirt.driver [-] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Instance spawned successfully.
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.674 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.677 186180 DEBUG nova.network.neutron [req-6b66a353-fd57-4aa7-bdd1-b9ddce8ebe22 req-1333e8bc-90bf-4d18-bcc9-7bf56e04e1e8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Updated VIF entry in instance network info cache for port cfacb3ed-a217-4bea-ac84-0fb16e1fa1af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.678 186180 DEBUG nova.network.neutron [req-6b66a353-fd57-4aa7-bdd1-b9ddce8ebe22 req-1333e8bc-90bf-4d18-bcc9-7bf56e04e1e8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Updating instance_info_cache with network_info: [{"id": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "address": "fa:16:3e:00:90:49", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfacb3ed-a2", "ovs_interfaceid": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.691 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.694 186180 DEBUG oslo_concurrency.lockutils [req-6b66a353-fd57-4aa7-bdd1-b9ddce8ebe22 req-1333e8bc-90bf-4d18-bcc9-7bf56e04e1e8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.696 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.704 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.705 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.705 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.706 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.706 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.706 186180 DEBUG nova.virt.libvirt.driver [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.732 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.733 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263617.6626053, 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.733 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] VM Paused (Lifecycle Event)
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.760 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.763 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263617.668519, 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.764 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] VM Resumed (Lifecycle Event)
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.769 186180 INFO nova.compute.manager [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Took 4.18 seconds to spawn the instance on the hypervisor.
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.769 186180 DEBUG nova.compute.manager [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.778 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.781 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.807 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.836 186180 INFO nova.compute.manager [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Took 4.58 seconds to build instance.
Feb 16 17:40:17 compute-0 nova_compute[186176]: 2026-02-16 17:40:17.852 186180 DEBUG oslo_concurrency.lockutils [None req-4ab30827-c18f-4de7-8934-0cee340301d0 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:40:17 compute-0 podman[212359]: 2026-02-16 17:40:17.888365102 +0000 UTC m=+0.052723444 container create 5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 16 17:40:17 compute-0 systemd[1]: Started libpod-conmon-5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436.scope.
Feb 16 17:40:17 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:40:17 compute-0 podman[212359]: 2026-02-16 17:40:17.86261453 +0000 UTC m=+0.026972912 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:40:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1c2cd5dddad42fb9311bfd638a0e1d0a93e04ec3c63edbb74d5e78d4d8ba50e5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:40:17 compute-0 podman[212359]: 2026-02-16 17:40:17.973138638 +0000 UTC m=+0.137496980 container init 5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 16 17:40:17 compute-0 podman[212359]: 2026-02-16 17:40:17.978790824 +0000 UTC m=+0.143149176 container start 5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:40:17 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[212374]: [NOTICE]   (212378) : New worker (212380) forked
Feb 16 17:40:17 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[212374]: [NOTICE]   (212378) : Loading success.
Feb 16 17:40:18 compute-0 nova_compute[186176]: 2026-02-16 17:40:18.753 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:19 compute-0 nova_compute[186176]: 2026-02-16 17:40:19.575 186180 DEBUG nova.compute.manager [req-c7bd1b24-837a-4296-9068-55fff5d768c2 req-f7bafe89-5b35-49f9-8d8e-9c54e0b61a98 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Received event network-vif-plugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:40:19 compute-0 nova_compute[186176]: 2026-02-16 17:40:19.575 186180 DEBUG oslo_concurrency.lockutils [req-c7bd1b24-837a-4296-9068-55fff5d768c2 req-f7bafe89-5b35-49f9-8d8e-9c54e0b61a98 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:40:19 compute-0 nova_compute[186176]: 2026-02-16 17:40:19.575 186180 DEBUG oslo_concurrency.lockutils [req-c7bd1b24-837a-4296-9068-55fff5d768c2 req-f7bafe89-5b35-49f9-8d8e-9c54e0b61a98 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:40:19 compute-0 nova_compute[186176]: 2026-02-16 17:40:19.575 186180 DEBUG oslo_concurrency.lockutils [req-c7bd1b24-837a-4296-9068-55fff5d768c2 req-f7bafe89-5b35-49f9-8d8e-9c54e0b61a98 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:40:19 compute-0 nova_compute[186176]: 2026-02-16 17:40:19.575 186180 DEBUG nova.compute.manager [req-c7bd1b24-837a-4296-9068-55fff5d768c2 req-f7bafe89-5b35-49f9-8d8e-9c54e0b61a98 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] No waiting events found dispatching network-vif-plugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:40:19 compute-0 nova_compute[186176]: 2026-02-16 17:40:19.576 186180 WARNING nova.compute.manager [req-c7bd1b24-837a-4296-9068-55fff5d768c2 req-f7bafe89-5b35-49f9-8d8e-9c54e0b61a98 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Received unexpected event network-vif-plugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af for instance with vm_state active and task_state None.
Feb 16 17:40:21 compute-0 podman[212389]: 2026-02-16 17:40:21.123940149 +0000 UTC m=+0.089014129 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Feb 16 17:40:21 compute-0 nova_compute[186176]: 2026-02-16 17:40:21.674 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:23 compute-0 podman[212410]: 2026-02-16 17:40:23.094875942 +0000 UTC m=+0.064851586 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 16 17:40:23 compute-0 nova_compute[186176]: 2026-02-16 17:40:23.755 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:26 compute-0 nova_compute[186176]: 2026-02-16 17:40:26.676 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:28 compute-0 nova_compute[186176]: 2026-02-16 17:40:28.756 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:29 compute-0 ovn_controller[96437]: 2026-02-16T17:40:29Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:00:90:49 10.100.0.3
Feb 16 17:40:29 compute-0 ovn_controller[96437]: 2026-02-16T17:40:29Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:00:90:49 10.100.0.3
Feb 16 17:40:29 compute-0 podman[195505]: time="2026-02-16T17:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:40:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:40:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2639 "" "Go-http-client/1.1"
Feb 16 17:40:29 compute-0 podman[212450]: 2026-02-16 17:40:29.851861936 +0000 UTC m=+0.072182133 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:40:29 compute-0 podman[212449]: 2026-02-16 17:40:29.903846111 +0000 UTC m=+0.127884008 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, org.label-schema.vendor=CentOS)
Feb 16 17:40:31 compute-0 openstack_network_exporter[198360]: ERROR   17:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:40:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:40:31 compute-0 openstack_network_exporter[198360]: ERROR   17:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:40:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:40:31 compute-0 nova_compute[186176]: 2026-02-16 17:40:31.678 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:32 compute-0 nova_compute[186176]: 2026-02-16 17:40:32.763 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:32 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:32.763 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:40:32 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:32.765 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:40:33 compute-0 nova_compute[186176]: 2026-02-16 17:40:33.759 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:36 compute-0 nova_compute[186176]: 2026-02-16 17:40:36.681 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:38.173 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:40:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:38.173 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:40:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:38.174 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:40:38 compute-0 nova_compute[186176]: 2026-02-16 17:40:38.761 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:40:39.769 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:40:41 compute-0 nova_compute[186176]: 2026-02-16 17:40:41.683 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:43 compute-0 nova_compute[186176]: 2026-02-16 17:40:43.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:40:43 compute-0 nova_compute[186176]: 2026-02-16 17:40:43.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:40:43 compute-0 nova_compute[186176]: 2026-02-16 17:40:43.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:40:43 compute-0 nova_compute[186176]: 2026-02-16 17:40:43.764 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:43 compute-0 nova_compute[186176]: 2026-02-16 17:40:43.971 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:40:43 compute-0 nova_compute[186176]: 2026-02-16 17:40:43.971 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:40:43 compute-0 nova_compute[186176]: 2026-02-16 17:40:43.972 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:40:43 compute-0 nova_compute[186176]: 2026-02-16 17:40:43.972 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:40:45 compute-0 nova_compute[186176]: 2026-02-16 17:40:45.457 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Updating instance_info_cache with network_info: [{"id": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "address": "fa:16:3e:00:90:49", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfacb3ed-a2", "ovs_interfaceid": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:40:45 compute-0 nova_compute[186176]: 2026-02-16 17:40:45.510 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:40:45 compute-0 nova_compute[186176]: 2026-02-16 17:40:45.510 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.346 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.346 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.347 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.347 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.430 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.513 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.515 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.604 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.686 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.814 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.816 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5657MB free_disk=73.1949234008789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.816 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.817 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.910 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.911 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.911 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.957 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:40:46 compute-0 nova_compute[186176]: 2026-02-16 17:40:46.973 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:40:47 compute-0 nova_compute[186176]: 2026-02-16 17:40:47.000 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:40:47 compute-0 nova_compute[186176]: 2026-02-16 17:40:47.001 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:40:48 compute-0 nova_compute[186176]: 2026-02-16 17:40:48.805 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:50 compute-0 nova_compute[186176]: 2026-02-16 17:40:50.003 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:40:50 compute-0 nova_compute[186176]: 2026-02-16 17:40:50.004 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:40:50 compute-0 nova_compute[186176]: 2026-02-16 17:40:50.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:40:50 compute-0 nova_compute[186176]: 2026-02-16 17:40:50.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:40:51 compute-0 nova_compute[186176]: 2026-02-16 17:40:51.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:40:51 compute-0 nova_compute[186176]: 2026-02-16 17:40:51.356 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:40:51 compute-0 nova_compute[186176]: 2026-02-16 17:40:51.689 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:52 compute-0 podman[212505]: 2026-02-16 17:40:52.104260585 +0000 UTC m=+0.081104579 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 
'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 17:40:52 compute-0 nova_compute[186176]: 2026-02-16 17:40:52.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:40:53 compute-0 nova_compute[186176]: 2026-02-16 17:40:53.807 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:54 compute-0 podman[212526]: 2026-02-16 17:40:54.09271612 +0000 UTC m=+0.062137341 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 16 17:40:55 compute-0 ovn_controller[96437]: 2026-02-16T17:40:55Z|00140|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Feb 16 17:40:56 compute-0 nova_compute[186176]: 2026-02-16 17:40:56.691 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:58 compute-0 nova_compute[186176]: 2026-02-16 17:40:58.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:40:58 compute-0 nova_compute[186176]: 2026-02-16 17:40:58.809 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:40:59 compute-0 podman[195505]: time="2026-02-16T17:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:40:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:40:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 16 17:41:00 compute-0 podman[212547]: 2026-02-16 17:41:00.090893638 +0000 UTC m=+0.052674562 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:41:00 compute-0 podman[212546]: 2026-02-16 17:41:00.173919302 +0000 UTC m=+0.137034979 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:41:01 compute-0 openstack_network_exporter[198360]: ERROR   17:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:41:01 compute-0 openstack_network_exporter[198360]: ERROR   17:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:41:01 compute-0 nova_compute[186176]: 2026-02-16 17:41:01.693 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:03 compute-0 nova_compute[186176]: 2026-02-16 17:41:03.856 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:06 compute-0 nova_compute[186176]: 2026-02-16 17:41:06.695 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:08 compute-0 nova_compute[186176]: 2026-02-16 17:41:08.862 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:11 compute-0 nova_compute[186176]: 2026-02-16 17:41:11.697 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:13 compute-0 nova_compute[186176]: 2026-02-16 17:41:13.863 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:15 compute-0 nova_compute[186176]: 2026-02-16 17:41:15.979 186180 DEBUG nova.virt.libvirt.driver [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Creating tmpfile /var/lib/nova/instances/tmpoawxk23p to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 17:41:16 compute-0 nova_compute[186176]: 2026-02-16 17:41:16.091 186180 DEBUG nova.compute.manager [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoawxk23p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 17:41:16 compute-0 nova_compute[186176]: 2026-02-16 17:41:16.722 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:17 compute-0 nova_compute[186176]: 2026-02-16 17:41:17.534 186180 DEBUG nova.compute.manager [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoawxk23p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b9a7a5c-0412-4863-b9d5-5de81954691e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 17:41:17 compute-0 nova_compute[186176]: 2026-02-16 17:41:17.557 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-6b9a7a5c-0412-4863-b9d5-5de81954691e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:41:17 compute-0 nova_compute[186176]: 2026-02-16 17:41:17.558 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-6b9a7a5c-0412-4863-b9d5-5de81954691e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:41:17 compute-0 nova_compute[186176]: 2026-02-16 17:41:17.558 186180 DEBUG nova.network.neutron [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:41:18 compute-0 nova_compute[186176]: 2026-02-16 17:41:18.902 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.297 186180 DEBUG nova.network.neutron [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Updating instance_info_cache with network_info: [{"id": "599e818c-6eef-46a1-9126-574260b721e3", "address": "fa:16:3e:21:8d:c3", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap599e818c-6e", "ovs_interfaceid": "599e818c-6eef-46a1-9126-574260b721e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.315 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-6b9a7a5c-0412-4863-b9d5-5de81954691e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.318 186180 DEBUG nova.virt.libvirt.driver [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoawxk23p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b9a7a5c-0412-4863-b9d5-5de81954691e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.319 186180 DEBUG nova.virt.libvirt.driver [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Creating instance directory: /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.320 186180 DEBUG nova.virt.libvirt.driver [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Creating disk.info with the contents: {'/var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk': 'qcow2', '/var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.320 186180 DEBUG nova.virt.libvirt.driver [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.320 186180 DEBUG nova.objects.instance [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6b9a7a5c-0412-4863-b9d5-5de81954691e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.362 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.422 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.424 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.425 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.443 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.496 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.497 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.534 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.535 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.536 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.602 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.603 186180 DEBUG nova.virt.disk.api [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Checking if we can resize image /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.603 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.679 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.681 186180 DEBUG nova.virt.disk.api [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Cannot resize image /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.681 186180 DEBUG nova.objects.instance [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid 6b9a7a5c-0412-4863-b9d5-5de81954691e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.699 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.722 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk.config 485376" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.725 186180 DEBUG nova.virt.libvirt.volume.remotefs [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk.config to /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 17:41:19 compute-0 nova_compute[186176]: 2026-02-16 17:41:19.726 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk.config /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.132 186180 DEBUG oslo_concurrency.processutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e/disk.config /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.133 186180 DEBUG nova.virt.libvirt.driver [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.135 186180 DEBUG nova.virt.libvirt.vif [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:40:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1081268620',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1081268620',id=18,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:40:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-ct66219r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:40:33Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=6b9a7a5c-0412-4863-b9d5-5de81954691e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "599e818c-6eef-46a1-9126-574260b721e3", "address": "fa:16:3e:21:8d:c3", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap599e818c-6e", "ovs_interfaceid": "599e818c-6eef-46a1-9126-574260b721e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.136 186180 DEBUG nova.network.os_vif_util [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "599e818c-6eef-46a1-9126-574260b721e3", "address": "fa:16:3e:21:8d:c3", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap599e818c-6e", "ovs_interfaceid": "599e818c-6eef-46a1-9126-574260b721e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.137 186180 DEBUG nova.network.os_vif_util [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:21:8d:c3,bridge_name='br-int',has_traffic_filtering=True,id=599e818c-6eef-46a1-9126-574260b721e3,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap599e818c-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.138 186180 DEBUG os_vif [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:8d:c3,bridge_name='br-int',has_traffic_filtering=True,id=599e818c-6eef-46a1-9126-574260b721e3,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap599e818c-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.139 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.139 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.140 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.144 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.145 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap599e818c-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.146 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap599e818c-6e, col_values=(('external_ids', {'iface-id': '599e818c-6eef-46a1-9126-574260b721e3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:21:8d:c3', 'vm-uuid': '6b9a7a5c-0412-4863-b9d5-5de81954691e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.148 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:20 compute-0 NetworkManager[56463]: <info>  [1771263680.1489] manager: (tap599e818c-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.151 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.154 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.155 186180 INFO os_vif [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:8d:c3,bridge_name='br-int',has_traffic_filtering=True,id=599e818c-6eef-46a1-9126-574260b721e3,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap599e818c-6e')
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.156 186180 DEBUG nova.virt.libvirt.driver [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 17:41:20 compute-0 nova_compute[186176]: 2026-02-16 17:41:20.157 186180 DEBUG nova.compute.manager [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoawxk23p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b9a7a5c-0412-4863-b9d5-5de81954691e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 17:41:22 compute-0 nova_compute[186176]: 2026-02-16 17:41:22.990 186180 DEBUG nova.network.neutron [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Port 599e818c-6eef-46a1-9126-574260b721e3 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 17:41:22 compute-0 nova_compute[186176]: 2026-02-16 17:41:22.992 186180 DEBUG nova.compute.manager [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpoawxk23p',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b9a7a5c-0412-4863-b9d5-5de81954691e',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 17:41:23 compute-0 podman[212619]: 2026-02-16 17:41:23.126318589 +0000 UTC m=+0.089975583 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347, version=9.7, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 
'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc.)
Feb 16 17:41:23 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 17:41:23 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 17:41:23 compute-0 NetworkManager[56463]: <info>  [1771263683.4195] manager: (tap599e818c-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Feb 16 17:41:23 compute-0 kernel: tap599e818c-6e: entered promiscuous mode
Feb 16 17:41:23 compute-0 nova_compute[186176]: 2026-02-16 17:41:23.422 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:23 compute-0 ovn_controller[96437]: 2026-02-16T17:41:23Z|00141|binding|INFO|Claiming lport 599e818c-6eef-46a1-9126-574260b721e3 for this additional chassis.
Feb 16 17:41:23 compute-0 ovn_controller[96437]: 2026-02-16T17:41:23Z|00142|binding|INFO|599e818c-6eef-46a1-9126-574260b721e3: Claiming fa:16:3e:21:8d:c3 10.100.0.10
Feb 16 17:41:23 compute-0 ovn_controller[96437]: 2026-02-16T17:41:23Z|00143|binding|INFO|Setting lport 599e818c-6eef-46a1-9126-574260b721e3 ovn-installed in OVS
Feb 16 17:41:23 compute-0 nova_compute[186176]: 2026-02-16 17:41:23.428 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:23 compute-0 systemd-machined[155631]: New machine qemu-14-instance-00000012.
Feb 16 17:41:23 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000012.
Feb 16 17:41:23 compute-0 systemd-udevd[212675]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:41:23 compute-0 NetworkManager[56463]: <info>  [1771263683.5002] device (tap599e818c-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:41:23 compute-0 NetworkManager[56463]: <info>  [1771263683.5007] device (tap599e818c-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:41:23 compute-0 nova_compute[186176]: 2026-02-16 17:41:23.905 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:24 compute-0 nova_compute[186176]: 2026-02-16 17:41:24.226 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263684.225852, 6b9a7a5c-0412-4863-b9d5-5de81954691e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:41:24 compute-0 nova_compute[186176]: 2026-02-16 17:41:24.226 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] VM Started (Lifecycle Event)
Feb 16 17:41:24 compute-0 nova_compute[186176]: 2026-02-16 17:41:24.318 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:41:24 compute-0 nova_compute[186176]: 2026-02-16 17:41:24.860 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263684.85975, 6b9a7a5c-0412-4863-b9d5-5de81954691e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:41:24 compute-0 nova_compute[186176]: 2026-02-16 17:41:24.861 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] VM Resumed (Lifecycle Event)
Feb 16 17:41:24 compute-0 nova_compute[186176]: 2026-02-16 17:41:24.899 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:41:24 compute-0 nova_compute[186176]: 2026-02-16 17:41:24.903 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:41:24 compute-0 nova_compute[186176]: 2026-02-16 17:41:24.929 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 17:41:25 compute-0 podman[212704]: 2026-02-16 17:41:25.102896808 +0000 UTC m=+0.072170623 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:41:25 compute-0 nova_compute[186176]: 2026-02-16 17:41:25.148 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:26 compute-0 ovn_controller[96437]: 2026-02-16T17:41:26Z|00144|binding|INFO|Claiming lport 599e818c-6eef-46a1-9126-574260b721e3 for this chassis.
Feb 16 17:41:26 compute-0 ovn_controller[96437]: 2026-02-16T17:41:26Z|00145|binding|INFO|599e818c-6eef-46a1-9126-574260b721e3: Claiming fa:16:3e:21:8d:c3 10.100.0.10
Feb 16 17:41:26 compute-0 ovn_controller[96437]: 2026-02-16T17:41:26Z|00146|binding|INFO|Setting lport 599e818c-6eef-46a1-9126-574260b721e3 up in Southbound
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.325 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:8d:c3 10.100.0.10'], port_security=['fa:16:3e:21:8d:c3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6b9a7a5c-0412-4863-b9d5-5de81954691e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '11', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=599e818c-6eef-46a1-9126-574260b721e3) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.326 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 599e818c-6eef-46a1-9126-574260b721e3 in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 bound to our chassis
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.327 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.346 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[6c0411a7-6b80-411d-9e6b-42323a134327]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.381 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[9d9be1c8-057a-4dcc-a4b6-354f3a928c6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.387 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[c8f658e7-f894-48a1-9e9a-df4c02fe282c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.413 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[de19010e-5b3a-4027-9ea3-7d4f9e0f6107]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.438 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[191864a9-eba9-4239-a165-dd4f30e96d07]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521294, 'reachable_time': 24460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212729, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.461 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[88d2a665-c04a-400e-9f13-fbe7673ce649]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap94cafcd0-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 521301, 'tstamp': 521301}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212730, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap94cafcd0-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 521303, 'tstamp': 521303}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212730, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.464 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:26 compute-0 nova_compute[186176]: 2026-02-16 17:41:26.466 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.467 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94cafcd0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.468 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.469 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94cafcd0-c0, col_values=(('external_ids', {'iface-id': '5c28d585-b48c-40c6-b5e7-f1e59317b2de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:26 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:26.469 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:41:26 compute-0 nova_compute[186176]: 2026-02-16 17:41:26.645 186180 INFO nova.compute.manager [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Post operation of migration started
Feb 16 17:41:27 compute-0 nova_compute[186176]: 2026-02-16 17:41:27.257 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-6b9a7a5c-0412-4863-b9d5-5de81954691e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:41:27 compute-0 nova_compute[186176]: 2026-02-16 17:41:27.258 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-6b9a7a5c-0412-4863-b9d5-5de81954691e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:41:27 compute-0 nova_compute[186176]: 2026-02-16 17:41:27.258 186180 DEBUG nova.network.neutron [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:41:28 compute-0 nova_compute[186176]: 2026-02-16 17:41:28.908 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:29 compute-0 podman[195505]: time="2026-02-16T17:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:41:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:41:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2636 "" "Go-http-client/1.1"
Feb 16 17:41:30 compute-0 nova_compute[186176]: 2026-02-16 17:41:30.151 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:31 compute-0 podman[212732]: 2026-02-16 17:41:31.111224772 +0000 UTC m=+0.072878580 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:41:31 compute-0 podman[212731]: 2026-02-16 17:41:31.1596041 +0000 UTC m=+0.121514324 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller)
Feb 16 17:41:31 compute-0 openstack_network_exporter[198360]: ERROR   17:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:41:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:41:31 compute-0 openstack_network_exporter[198360]: ERROR   17:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:41:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:41:32 compute-0 nova_compute[186176]: 2026-02-16 17:41:32.082 186180 DEBUG nova.network.neutron [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Updating instance_info_cache with network_info: [{"id": "599e818c-6eef-46a1-9126-574260b721e3", "address": "fa:16:3e:21:8d:c3", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap599e818c-6e", "ovs_interfaceid": "599e818c-6eef-46a1-9126-574260b721e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:41:32 compute-0 nova_compute[186176]: 2026-02-16 17:41:32.920 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-6b9a7a5c-0412-4863-b9d5-5de81954691e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:41:32 compute-0 nova_compute[186176]: 2026-02-16 17:41:32.944 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:32 compute-0 nova_compute[186176]: 2026-02-16 17:41:32.945 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:32 compute-0 nova_compute[186176]: 2026-02-16 17:41:32.946 186180 DEBUG oslo_concurrency.lockutils [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:32 compute-0 nova_compute[186176]: 2026-02-16 17:41:32.952 186180 INFO nova.virt.libvirt.driver [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 17:41:32 compute-0 virtqemud[185389]: Domain id=14 name='instance-00000012' uuid=6b9a7a5c-0412-4863-b9d5-5de81954691e is tainted: custom-monitor
Feb 16 17:41:33 compute-0 nova_compute[186176]: 2026-02-16 17:41:33.911 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:33 compute-0 nova_compute[186176]: 2026-02-16 17:41:33.960 186180 INFO nova.virt.libvirt.driver [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 17:41:34 compute-0 nova_compute[186176]: 2026-02-16 17:41:34.968 186180 INFO nova.virt.libvirt.driver [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 17:41:34 compute-0 nova_compute[186176]: 2026-02-16 17:41:34.975 186180 DEBUG nova.compute.manager [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:41:35 compute-0 nova_compute[186176]: 2026-02-16 17:41:35.036 186180 DEBUG nova.objects.instance [None req-c5f79013-df27-4fe9-856a-42b4787dcff4 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 17:41:35 compute-0 nova_compute[186176]: 2026-02-16 17:41:35.153 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.173 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.174 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.174 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.423 186180 DEBUG oslo_concurrency.lockutils [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "6b9a7a5c-0412-4863-b9d5-5de81954691e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.424 186180 DEBUG oslo_concurrency.lockutils [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6b9a7a5c-0412-4863-b9d5-5de81954691e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.425 186180 DEBUG oslo_concurrency.lockutils [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "6b9a7a5c-0412-4863-b9d5-5de81954691e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.425 186180 DEBUG oslo_concurrency.lockutils [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6b9a7a5c-0412-4863-b9d5-5de81954691e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.426 186180 DEBUG oslo_concurrency.lockutils [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6b9a7a5c-0412-4863-b9d5-5de81954691e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.427 186180 INFO nova.compute.manager [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Terminating instance
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.429 186180 DEBUG nova.compute.manager [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:41:38 compute-0 kernel: tap599e818c-6e (unregistering): left promiscuous mode
Feb 16 17:41:38 compute-0 NetworkManager[56463]: <info>  [1771263698.4696] device (tap599e818c-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:41:38 compute-0 ovn_controller[96437]: 2026-02-16T17:41:38Z|00147|binding|INFO|Releasing lport 599e818c-6eef-46a1-9126-574260b721e3 from this chassis (sb_readonly=0)
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.477 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:38 compute-0 ovn_controller[96437]: 2026-02-16T17:41:38Z|00148|binding|INFO|Setting lport 599e818c-6eef-46a1-9126-574260b721e3 down in Southbound
Feb 16 17:41:38 compute-0 ovn_controller[96437]: 2026-02-16T17:41:38Z|00149|binding|INFO|Removing iface tap599e818c-6e ovn-installed in OVS
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.480 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.487 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:8d:c3 10.100.0.10'], port_security=['fa:16:3e:21:8d:c3 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6b9a7a5c-0412-4863-b9d5-5de81954691e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '13', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=599e818c-6eef-46a1-9126-574260b721e3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.489 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.491 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 599e818c-6eef-46a1-9126-574260b721e3 in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 unbound from our chassis
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.494 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.511 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[5e2d2750-237f-4931-b646-952b841c309d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:38 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000012.scope: Deactivated successfully.
Feb 16 17:41:38 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000012.scope: Consumed 1.828s CPU time.
Feb 16 17:41:38 compute-0 systemd-machined[155631]: Machine qemu-14-instance-00000012 terminated.
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.542 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[d84edd74-9f24-4358-b722-a3da1961daa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.548 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[efb6a96b-5e0a-41ba-9d31-a82b508d11d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.578 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[8e52a60b-5fe1-488e-a2ad-4fe051cb673e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.595 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[28cd14f5-67c5-4736-bea0-f7e8cbc1deeb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521294, 'reachable_time': 24460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212793, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.612 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[2708e818-dbd6-4581-ad4c-30dc5dabefe9]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap94cafcd0-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 521301, 'tstamp': 521301}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212794, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap94cafcd0-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 521303, 'tstamp': 521303}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212794, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.614 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.616 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.621 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.622 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94cafcd0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.623 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.623 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94cafcd0-c0, col_values=(('external_ids', {'iface-id': '5c28d585-b48c-40c6-b5e7-f1e59317b2de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:38.624 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.701 186180 INFO nova.virt.libvirt.driver [-] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Instance destroyed successfully.
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.702 186180 DEBUG nova.objects.instance [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'resources' on Instance uuid 6b9a7a5c-0412-4863-b9d5-5de81954691e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.723 186180 DEBUG nova.virt.libvirt.vif [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T17:40:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1081268620',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1081268620',id=18,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:40:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-ct66219r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:41:35Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=6b9a7a5c-0412-4863-b9d5-5de81954691e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "599e818c-6eef-46a1-9126-574260b721e3", "address": "fa:16:3e:21:8d:c3", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap599e818c-6e", "ovs_interfaceid": "599e818c-6eef-46a1-9126-574260b721e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.724 186180 DEBUG nova.network.os_vif_util [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "599e818c-6eef-46a1-9126-574260b721e3", "address": "fa:16:3e:21:8d:c3", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap599e818c-6e", "ovs_interfaceid": "599e818c-6eef-46a1-9126-574260b721e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.725 186180 DEBUG nova.network.os_vif_util [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:21:8d:c3,bridge_name='br-int',has_traffic_filtering=True,id=599e818c-6eef-46a1-9126-574260b721e3,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap599e818c-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.725 186180 DEBUG os_vif [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:8d:c3,bridge_name='br-int',has_traffic_filtering=True,id=599e818c-6eef-46a1-9126-574260b721e3,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap599e818c-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.727 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.728 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap599e818c-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.729 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.732 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.735 186180 INFO os_vif [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:21:8d:c3,bridge_name='br-int',has_traffic_filtering=True,id=599e818c-6eef-46a1-9126-574260b721e3,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap599e818c-6e')
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.735 186180 INFO nova.virt.libvirt.driver [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Deleting instance files /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e_del
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.736 186180 INFO nova.virt.libvirt.driver [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Deletion of /var/lib/nova/instances/6b9a7a5c-0412-4863-b9d5-5de81954691e_del complete
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.785 186180 INFO nova.compute.manager [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Took 0.36 seconds to destroy the instance on the hypervisor.
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.786 186180 DEBUG oslo.service.loopingcall [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.786 186180 DEBUG nova.compute.manager [-] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.787 186180 DEBUG nova.network.neutron [-] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:41:38 compute-0 nova_compute[186176]: 2026-02-16 17:41:38.912 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.238 186180 DEBUG nova.compute.manager [req-27c6774d-0ad3-4410-bf4c-0cfc8574e7e0 req-3be1d844-19f5-4304-bcd4-d309e4a74838 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Received event network-vif-unplugged-599e818c-6eef-46a1-9126-574260b721e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.238 186180 DEBUG oslo_concurrency.lockutils [req-27c6774d-0ad3-4410-bf4c-0cfc8574e7e0 req-3be1d844-19f5-4304-bcd4-d309e4a74838 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b9a7a5c-0412-4863-b9d5-5de81954691e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.239 186180 DEBUG oslo_concurrency.lockutils [req-27c6774d-0ad3-4410-bf4c-0cfc8574e7e0 req-3be1d844-19f5-4304-bcd4-d309e4a74838 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b9a7a5c-0412-4863-b9d5-5de81954691e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.239 186180 DEBUG oslo_concurrency.lockutils [req-27c6774d-0ad3-4410-bf4c-0cfc8574e7e0 req-3be1d844-19f5-4304-bcd4-d309e4a74838 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b9a7a5c-0412-4863-b9d5-5de81954691e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.239 186180 DEBUG nova.compute.manager [req-27c6774d-0ad3-4410-bf4c-0cfc8574e7e0 req-3be1d844-19f5-4304-bcd4-d309e4a74838 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] No waiting events found dispatching network-vif-unplugged-599e818c-6eef-46a1-9126-574260b721e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.240 186180 DEBUG nova.compute.manager [req-27c6774d-0ad3-4410-bf4c-0cfc8574e7e0 req-3be1d844-19f5-4304-bcd4-d309e4a74838 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Received event network-vif-unplugged-599e818c-6eef-46a1-9126-574260b721e3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.334 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.360 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:39.360 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:41:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:39.362 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.514 186180 DEBUG nova.network.neutron [-] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.571 186180 INFO nova.compute.manager [-] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Took 0.78 seconds to deallocate network for instance.
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.588 186180 DEBUG nova.compute.manager [req-2e105797-2f4a-4fed-933c-9d52c0bedbbb req-2c70dbbe-9c20-4449-9ac0-a5d91c79d1c3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Received event network-vif-deleted-599e818c-6eef-46a1-9126-574260b721e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.653 186180 DEBUG oslo_concurrency.lockutils [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.654 186180 DEBUG oslo_concurrency.lockutils [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.672 186180 DEBUG oslo_concurrency.lockutils [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.018s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.722 186180 INFO nova.scheduler.client.report [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Deleted allocations for instance 6b9a7a5c-0412-4863-b9d5-5de81954691e
Feb 16 17:41:39 compute-0 nova_compute[186176]: 2026-02-16 17:41:39.839 186180 DEBUG oslo_concurrency.lockutils [None req-689e58bd-c860-4be0-a637-805ed60a70d5 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6b9a7a5c-0412-4863-b9d5-5de81954691e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:40 compute-0 nova_compute[186176]: 2026-02-16 17:41:40.783 186180 DEBUG oslo_concurrency.lockutils [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:40 compute-0 nova_compute[186176]: 2026-02-16 17:41:40.784 186180 DEBUG oslo_concurrency.lockutils [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:40 compute-0 nova_compute[186176]: 2026-02-16 17:41:40.784 186180 DEBUG oslo_concurrency.lockutils [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:40 compute-0 nova_compute[186176]: 2026-02-16 17:41:40.785 186180 DEBUG oslo_concurrency.lockutils [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:40 compute-0 nova_compute[186176]: 2026-02-16 17:41:40.785 186180 DEBUG oslo_concurrency.lockutils [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:40 compute-0 nova_compute[186176]: 2026-02-16 17:41:40.787 186180 INFO nova.compute.manager [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Terminating instance
Feb 16 17:41:40 compute-0 nova_compute[186176]: 2026-02-16 17:41:40.789 186180 DEBUG nova.compute.manager [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:41:40 compute-0 kernel: tapcfacb3ed-a2 (unregistering): left promiscuous mode
Feb 16 17:41:40 compute-0 NetworkManager[56463]: <info>  [1771263700.8250] device (tapcfacb3ed-a2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:41:40 compute-0 ovn_controller[96437]: 2026-02-16T17:41:40Z|00150|binding|INFO|Releasing lport cfacb3ed-a217-4bea-ac84-0fb16e1fa1af from this chassis (sb_readonly=0)
Feb 16 17:41:40 compute-0 ovn_controller[96437]: 2026-02-16T17:41:40Z|00151|binding|INFO|Setting lport cfacb3ed-a217-4bea-ac84-0fb16e1fa1af down in Southbound
Feb 16 17:41:40 compute-0 ovn_controller[96437]: 2026-02-16T17:41:40Z|00152|binding|INFO|Removing iface tapcfacb3ed-a2 ovn-installed in OVS
Feb 16 17:41:40 compute-0 nova_compute[186176]: 2026-02-16 17:41:40.830 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:40 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:40.836 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:90:49 10.100.0.3'], port_security=['fa:16:3e:00:90:49 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6fc932c5-7baf-4d8d-a2fa-6b79e4937c72', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=cfacb3ed-a217-4bea-ac84-0fb16e1fa1af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:41:40 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:40.838 105730 INFO neutron.agent.ovn.metadata.agent [-] Port cfacb3ed-a217-4bea-ac84-0fb16e1fa1af in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 unbound from our chassis
Feb 16 17:41:40 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:40.839 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:41:40 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:40.841 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[9024cfe7-b95d-405e-a023-3d6e563ef5e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:40 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:40.841 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace which is not needed anymore
Feb 16 17:41:40 compute-0 nova_compute[186176]: 2026-02-16 17:41:40.844 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:40 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Deactivated successfully.
Feb 16 17:41:40 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000011.scope: Consumed 14.172s CPU time.
Feb 16 17:41:40 compute-0 systemd-machined[155631]: Machine qemu-13-instance-00000011 terminated.
Feb 16 17:41:40 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[212374]: [NOTICE]   (212378) : haproxy version is 2.8.14-c23fe91
Feb 16 17:41:40 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[212374]: [NOTICE]   (212378) : path to executable is /usr/sbin/haproxy
Feb 16 17:41:40 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[212374]: [WARNING]  (212378) : Exiting Master process...
Feb 16 17:41:40 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[212374]: [WARNING]  (212378) : Exiting Master process...
Feb 16 17:41:40 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[212374]: [ALERT]    (212378) : Current worker (212380) exited with code 143 (Terminated)
Feb 16 17:41:40 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[212374]: [WARNING]  (212378) : All workers exited. Exiting... (0)
Feb 16 17:41:40 compute-0 systemd[1]: libpod-5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436.scope: Deactivated successfully.
Feb 16 17:41:41 compute-0 podman[212836]: 2026-02-16 17:41:41.005815391 +0000 UTC m=+0.055137292 container died 5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 17:41:41 compute-0 NetworkManager[56463]: <info>  [1771263701.0080] manager: (tapcfacb3ed-a2): new Tun device (/org/freedesktop/NetworkManager/Devices/66)
Feb 16 17:41:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436-userdata-shm.mount: Deactivated successfully.
Feb 16 17:41:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-1c2cd5dddad42fb9311bfd638a0e1d0a93e04ec3c63edbb74d5e78d4d8ba50e5-merged.mount: Deactivated successfully.
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.049 186180 INFO nova.virt.libvirt.driver [-] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Instance destroyed successfully.
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.049 186180 DEBUG nova.objects.instance [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'resources' on Instance uuid 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:41:41 compute-0 podman[212836]: 2026-02-16 17:41:41.057128959 +0000 UTC m=+0.106450820 container cleanup 5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 17:41:41 compute-0 systemd[1]: libpod-conmon-5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436.scope: Deactivated successfully.
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.066 186180 DEBUG nova.virt.libvirt.vif [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:40:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2143366756',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2143366756',id=17,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:40:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-12tqpd9i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:40:17Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=6fc932c5-7baf-4d8d-a2fa-6b79e4937c72,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "address": "fa:16:3e:00:90:49", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfacb3ed-a2", "ovs_interfaceid": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.067 186180 DEBUG nova.network.os_vif_util [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "address": "fa:16:3e:00:90:49", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcfacb3ed-a2", "ovs_interfaceid": "cfacb3ed-a217-4bea-ac84-0fb16e1fa1af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.068 186180 DEBUG nova.network.os_vif_util [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:00:90:49,bridge_name='br-int',has_traffic_filtering=True,id=cfacb3ed-a217-4bea-ac84-0fb16e1fa1af,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfacb3ed-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.068 186180 DEBUG os_vif [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:00:90:49,bridge_name='br-int',has_traffic_filtering=True,id=cfacb3ed-a217-4bea-ac84-0fb16e1fa1af,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfacb3ed-a2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.071 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.071 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcfacb3ed-a2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.073 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.075 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.078 186180 INFO os_vif [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:00:90:49,bridge_name='br-int',has_traffic_filtering=True,id=cfacb3ed-a217-4bea-ac84-0fb16e1fa1af,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcfacb3ed-a2')
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.080 186180 INFO nova.virt.libvirt.driver [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Deleting instance files /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72_del
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.081 186180 INFO nova.virt.libvirt.driver [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Deletion of /var/lib/nova/instances/6fc932c5-7baf-4d8d-a2fa-6b79e4937c72_del complete
Feb 16 17:41:41 compute-0 podman[212882]: 2026-02-16 17:41:41.127480897 +0000 UTC m=+0.049461795 container remove 5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 16 17:41:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:41.133 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[8b79d67c-f3a4-4bc8-9625-af1b4c21c66c]: (4, ('Mon Feb 16 05:41:40 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436)\n5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436\nMon Feb 16 05:41:41 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436)\n5dfb0a85c531d1df2db72cd282ce9ce5ab9f9559bf4a089dd8512b90cda0e436\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:41.135 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[db31f377-dcbe-4e12-ab98-fe8ee4001275]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:41.137 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:41 compute-0 kernel: tap94cafcd0-c0: left promiscuous mode
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.141 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.146 186180 INFO nova.compute.manager [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Took 0.36 seconds to destroy the instance on the hypervisor.
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.147 186180 DEBUG oslo.service.loopingcall [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.147 186180 DEBUG nova.compute.manager [-] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.148 186180 DEBUG nova.network.neutron [-] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:41:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:41.150 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5edc00-ad22-46cd-886f-cba6ae2fed94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.151 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:41.164 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f00baa2f-974a-4905-bf93-0cb5a465e914]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:41.166 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f9bb8bbd-4ce2-406e-af8b-c3b23615a3c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:41.180 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[7d1189c6-a17b-4c7f-ad8c-221ed4fc7021]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 521288, 'reachable_time': 31592, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212897, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:41.182 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:41:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:41.183 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[1127d3ab-b425-4516-9838-3edfe1bc5f41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:41:41 compute-0 systemd[1]: run-netns-ovnmeta\x2d94cafcd0\x2dc7c2\x2d48b4\x2da2dd\x2d21c16ce48dc4.mount: Deactivated successfully.
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.302 186180 DEBUG nova.compute.manager [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Received event network-vif-plugged-599e818c-6eef-46a1-9126-574260b721e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.303 186180 DEBUG oslo_concurrency.lockutils [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b9a7a5c-0412-4863-b9d5-5de81954691e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.303 186180 DEBUG oslo_concurrency.lockutils [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b9a7a5c-0412-4863-b9d5-5de81954691e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.303 186180 DEBUG oslo_concurrency.lockutils [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b9a7a5c-0412-4863-b9d5-5de81954691e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.304 186180 DEBUG nova.compute.manager [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] No waiting events found dispatching network-vif-plugged-599e818c-6eef-46a1-9126-574260b721e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.304 186180 WARNING nova.compute.manager [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Received unexpected event network-vif-plugged-599e818c-6eef-46a1-9126-574260b721e3 for instance with vm_state deleted and task_state None.
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.304 186180 DEBUG nova.compute.manager [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Received event network-vif-unplugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.305 186180 DEBUG oslo_concurrency.lockutils [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.305 186180 DEBUG oslo_concurrency.lockutils [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.305 186180 DEBUG oslo_concurrency.lockutils [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.306 186180 DEBUG nova.compute.manager [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] No waiting events found dispatching network-vif-unplugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.306 186180 DEBUG nova.compute.manager [req-77d93621-3def-4e20-905b-ac0035296c75 req-5d85e1c1-a0d5-4ce5-99c7-ca99db53b314 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Received event network-vif-unplugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.626 186180 DEBUG nova.network.neutron [-] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.642 186180 INFO nova.compute.manager [-] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Took 0.49 seconds to deallocate network for instance.
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.689 186180 DEBUG oslo_concurrency.lockutils [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.689 186180 DEBUG oslo_concurrency.lockutils [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.741 186180 DEBUG nova.compute.provider_tree [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.758 186180 DEBUG nova.scheduler.client.report [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.803 186180 DEBUG oslo_concurrency.lockutils [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:41 compute-0 nova_compute[186176]: 2026-02-16 17:41:41.832 186180 INFO nova.scheduler.client.report [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Deleted allocations for instance 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72
Feb 16 17:41:42 compute-0 nova_compute[186176]: 2026-02-16 17:41:42.076 186180 DEBUG oslo_concurrency.lockutils [None req-6d1a0473-15d1-48f3-aab0-9258d4f7964f c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:41:42.364 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:41:43 compute-0 nova_compute[186176]: 2026-02-16 17:41:43.387 186180 DEBUG nova.compute.manager [req-6b4008ab-c6cc-42df-a41c-f225e36027d4 req-896a62e5-6771-4909-b8f8-f6af7e2adb4f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Received event network-vif-plugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:41:43 compute-0 nova_compute[186176]: 2026-02-16 17:41:43.387 186180 DEBUG oslo_concurrency.lockutils [req-6b4008ab-c6cc-42df-a41c-f225e36027d4 req-896a62e5-6771-4909-b8f8-f6af7e2adb4f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:43 compute-0 nova_compute[186176]: 2026-02-16 17:41:43.387 186180 DEBUG oslo_concurrency.lockutils [req-6b4008ab-c6cc-42df-a41c-f225e36027d4 req-896a62e5-6771-4909-b8f8-f6af7e2adb4f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:43 compute-0 nova_compute[186176]: 2026-02-16 17:41:43.388 186180 DEBUG oslo_concurrency.lockutils [req-6b4008ab-c6cc-42df-a41c-f225e36027d4 req-896a62e5-6771-4909-b8f8-f6af7e2adb4f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6fc932c5-7baf-4d8d-a2fa-6b79e4937c72-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:43 compute-0 nova_compute[186176]: 2026-02-16 17:41:43.388 186180 DEBUG nova.compute.manager [req-6b4008ab-c6cc-42df-a41c-f225e36027d4 req-896a62e5-6771-4909-b8f8-f6af7e2adb4f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] No waiting events found dispatching network-vif-plugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:41:43 compute-0 nova_compute[186176]: 2026-02-16 17:41:43.388 186180 WARNING nova.compute.manager [req-6b4008ab-c6cc-42df-a41c-f225e36027d4 req-896a62e5-6771-4909-b8f8-f6af7e2adb4f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Received unexpected event network-vif-plugged-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af for instance with vm_state deleted and task_state None.
Feb 16 17:41:43 compute-0 nova_compute[186176]: 2026-02-16 17:41:43.388 186180 DEBUG nova.compute.manager [req-6b4008ab-c6cc-42df-a41c-f225e36027d4 req-896a62e5-6771-4909-b8f8-f6af7e2adb4f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Received event network-vif-deleted-cfacb3ed-a217-4bea-ac84-0fb16e1fa1af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:41:43 compute-0 nova_compute[186176]: 2026-02-16 17:41:43.942 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:45 compute-0 nova_compute[186176]: 2026-02-16 17:41:45.335 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:45 compute-0 nova_compute[186176]: 2026-02-16 17:41:45.336 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:41:45 compute-0 nova_compute[186176]: 2026-02-16 17:41:45.336 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:41:45 compute-0 nova_compute[186176]: 2026-02-16 17:41:45.398 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.077 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.338 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.339 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.339 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.340 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.551 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.552 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5802MB free_disk=73.22380447387695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.552 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.553 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.602 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.603 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.661 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.678 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.700 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:41:46 compute-0 nova_compute[186176]: 2026-02-16 17:41:46.700 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:41:47 compute-0 nova_compute[186176]: 2026-02-16 17:41:47.701 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:48 compute-0 nova_compute[186176]: 2026-02-16 17:41:48.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:48 compute-0 nova_compute[186176]: 2026-02-16 17:41:48.994 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:50 compute-0 nova_compute[186176]: 2026-02-16 17:41:50.336 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:50 compute-0 nova_compute[186176]: 2026-02-16 17:41:50.336 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:41:51 compute-0 nova_compute[186176]: 2026-02-16 17:41:51.082 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:51 compute-0 nova_compute[186176]: 2026-02-16 17:41:51.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:52 compute-0 nova_compute[186176]: 2026-02-16 17:41:52.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:53 compute-0 nova_compute[186176]: 2026-02-16 17:41:53.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:53 compute-0 nova_compute[186176]: 2026-02-16 17:41:53.699 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771263698.697527, 6b9a7a5c-0412-4863-b9d5-5de81954691e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:41:53 compute-0 nova_compute[186176]: 2026-02-16 17:41:53.700 186180 INFO nova.compute.manager [-] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] VM Stopped (Lifecycle Event)
Feb 16 17:41:53 compute-0 nova_compute[186176]: 2026-02-16 17:41:53.719 186180 DEBUG nova.compute.manager [None req-72d34920-91da-4979-bcea-42f6b6f23ffd - - - - - -] [instance: 6b9a7a5c-0412-4863-b9d5-5de81954691e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:41:53 compute-0 nova_compute[186176]: 2026-02-16 17:41:53.997 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:54 compute-0 podman[212899]: 2026-02-16 17:41:54.116917475 +0000 UTC m=+0.075817441 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter)
Feb 16 17:41:54 compute-0 nova_compute[186176]: 2026-02-16 17:41:54.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:54 compute-0 nova_compute[186176]: 2026-02-16 17:41:54.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:54 compute-0 nova_compute[186176]: 2026-02-16 17:41:54.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 17:41:56 compute-0 nova_compute[186176]: 2026-02-16 17:41:56.047 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771263701.0462294, 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:41:56 compute-0 nova_compute[186176]: 2026-02-16 17:41:56.048 186180 INFO nova.compute.manager [-] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] VM Stopped (Lifecycle Event)
Feb 16 17:41:56 compute-0 nova_compute[186176]: 2026-02-16 17:41:56.072 186180 DEBUG nova.compute.manager [None req-07c96d29-8f04-4588-addd-4fee465f7d8f - - - - - -] [instance: 6fc932c5-7baf-4d8d-a2fa-6b79e4937c72] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:41:56 compute-0 nova_compute[186176]: 2026-02-16 17:41:56.085 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:56 compute-0 podman[212920]: 2026-02-16 17:41:56.115222288 +0000 UTC m=+0.080226387 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 16 17:41:58 compute-0 nova_compute[186176]: 2026-02-16 17:41:58.332 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:41:59 compute-0 nova_compute[186176]: 2026-02-16 17:41:59.000 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:41:59 compute-0 podman[195505]: time="2026-02-16T17:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:41:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:41:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2173 "" "Go-http-client/1.1"
Feb 16 17:42:01 compute-0 nova_compute[186176]: 2026-02-16 17:42:01.089 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:01 compute-0 openstack_network_exporter[198360]: ERROR   17:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:42:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:42:01 compute-0 openstack_network_exporter[198360]: ERROR   17:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:42:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:42:02 compute-0 podman[212941]: 2026-02-16 17:42:02.092519204 +0000 UTC m=+0.062917160 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:42:02 compute-0 podman[212940]: 2026-02-16 17:42:02.119486265 +0000 UTC m=+0.091953491 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 16 17:42:04 compute-0 nova_compute[186176]: 2026-02-16 17:42:04.047 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:06 compute-0 nova_compute[186176]: 2026-02-16 17:42:06.093 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:09 compute-0 nova_compute[186176]: 2026-02-16 17:42:09.049 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:11 compute-0 nova_compute[186176]: 2026-02-16 17:42:11.097 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:11 compute-0 ovn_controller[96437]: 2026-02-16T17:42:11Z|00153|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Feb 16 17:42:14 compute-0 nova_compute[186176]: 2026-02-16 17:42:14.088 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:16 compute-0 nova_compute[186176]: 2026-02-16 17:42:16.101 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:19 compute-0 nova_compute[186176]: 2026-02-16 17:42:19.137 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:21 compute-0 nova_compute[186176]: 2026-02-16 17:42:21.106 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:24 compute-0 nova_compute[186176]: 2026-02-16 17:42:24.139 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:25 compute-0 podman[212991]: 2026-02-16 17:42:25.10746293 +0000 UTC m=+0.070435561 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Feb 16 17:42:26 compute-0 nova_compute[186176]: 2026-02-16 17:42:26.109 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:27 compute-0 podman[213012]: 2026-02-16 17:42:27.09401701 +0000 UTC m=+0.061553706 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 16 17:42:29 compute-0 nova_compute[186176]: 2026-02-16 17:42:29.142 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:29 compute-0 podman[195505]: time="2026-02-16T17:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:42:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:42:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2173 "" "Go-http-client/1.1"
Feb 16 17:42:31 compute-0 nova_compute[186176]: 2026-02-16 17:42:31.113 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:31 compute-0 openstack_network_exporter[198360]: ERROR   17:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:42:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:42:31 compute-0 openstack_network_exporter[198360]: ERROR   17:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:42:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:42:33 compute-0 podman[213032]: 2026-02-16 17:42:33.110823629 +0000 UTC m=+0.070225886 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:42:33 compute-0 podman[213031]: 2026-02-16 17:42:33.142418012 +0000 UTC m=+0.108844318 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 17:42:34 compute-0 nova_compute[186176]: 2026-02-16 17:42:34.188 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:36 compute-0 nova_compute[186176]: 2026-02-16 17:42:36.117 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:36 compute-0 nova_compute[186176]: 2026-02-16 17:42:36.620 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:42:36 compute-0 nova_compute[186176]: 2026-02-16 17:42:36.621 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:42:36 compute-0 nova_compute[186176]: 2026-02-16 17:42:36.647 186180 DEBUG nova.compute.manager [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:42:36 compute-0 nova_compute[186176]: 2026-02-16 17:42:36.764 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:42:36 compute-0 nova_compute[186176]: 2026-02-16 17:42:36.765 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:42:36 compute-0 nova_compute[186176]: 2026-02-16 17:42:36.779 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:42:36 compute-0 nova_compute[186176]: 2026-02-16 17:42:36.780 186180 INFO nova.compute.claims [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.041 186180 DEBUG nova.compute.provider_tree [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.070 186180 DEBUG nova.scheduler.client.report [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.098 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.098 186180 DEBUG nova.compute.manager [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.167 186180 DEBUG nova.compute.manager [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.168 186180 DEBUG nova.network.neutron [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.214 186180 INFO nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.239 186180 DEBUG nova.compute.manager [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.380 186180 DEBUG nova.compute.manager [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.383 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.383 186180 INFO nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Creating image(s)
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.384 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "/var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.384 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.385 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.414 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.482 186180 DEBUG nova.policy [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54934f49b2044289bcf127662fe114b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.489 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.490 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.491 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.512 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.577 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.579 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.615 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.616 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.617 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.682 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.683 186180 DEBUG nova.virt.disk.api [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Checking if we can resize image /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.684 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.764 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.766 186180 DEBUG nova.virt.disk.api [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Cannot resize image /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.767 186180 DEBUG nova.objects.instance [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'migration_context' on Instance uuid b704e6db-26a3-4e50-a981-c2e7f6d427f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.795 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.795 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Ensure instance console log exists: /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.796 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.796 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:42:37 compute-0 nova_compute[186176]: 2026-02-16 17:42:37.796 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:42:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:38.174 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:42:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:38.175 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:42:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:38.175 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:42:38 compute-0 nova_compute[186176]: 2026-02-16 17:42:38.219 186180 DEBUG nova.network.neutron [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Successfully created port: 749dcde6-e031-42ae-bf1f-5aa5824cd2c9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:42:39 compute-0 nova_compute[186176]: 2026-02-16 17:42:39.232 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:39 compute-0 nova_compute[186176]: 2026-02-16 17:42:39.845 186180 DEBUG nova.network.neutron [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Successfully updated port: 749dcde6-e031-42ae-bf1f-5aa5824cd2c9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:42:40 compute-0 nova_compute[186176]: 2026-02-16 17:42:40.023 186180 DEBUG nova.compute.manager [req-e647c37d-9bf6-4315-bb9e-3c61bf87ba51 req-cb2a15c3-f59a-4ef8-ac60-57e9edd84694 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Received event network-changed-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:42:40 compute-0 nova_compute[186176]: 2026-02-16 17:42:40.023 186180 DEBUG nova.compute.manager [req-e647c37d-9bf6-4315-bb9e-3c61bf87ba51 req-cb2a15c3-f59a-4ef8-ac60-57e9edd84694 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Refreshing instance network info cache due to event network-changed-749dcde6-e031-42ae-bf1f-5aa5824cd2c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:42:40 compute-0 nova_compute[186176]: 2026-02-16 17:42:40.024 186180 DEBUG oslo_concurrency.lockutils [req-e647c37d-9bf6-4315-bb9e-3c61bf87ba51 req-cb2a15c3-f59a-4ef8-ac60-57e9edd84694 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-b704e6db-26a3-4e50-a981-c2e7f6d427f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:42:40 compute-0 nova_compute[186176]: 2026-02-16 17:42:40.024 186180 DEBUG oslo_concurrency.lockutils [req-e647c37d-9bf6-4315-bb9e-3c61bf87ba51 req-cb2a15c3-f59a-4ef8-ac60-57e9edd84694 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-b704e6db-26a3-4e50-a981-c2e7f6d427f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:42:40 compute-0 nova_compute[186176]: 2026-02-16 17:42:40.024 186180 DEBUG nova.network.neutron [req-e647c37d-9bf6-4315-bb9e-3c61bf87ba51 req-cb2a15c3-f59a-4ef8-ac60-57e9edd84694 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Refreshing network info cache for port 749dcde6-e031-42ae-bf1f-5aa5824cd2c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:42:40 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:40.025 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:42:40 compute-0 nova_compute[186176]: 2026-02-16 17:42:40.026 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:40 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:40.026 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:42:40 compute-0 nova_compute[186176]: 2026-02-16 17:42:40.036 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "refresh_cache-b704e6db-26a3-4e50-a981-c2e7f6d427f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:42:41 compute-0 nova_compute[186176]: 2026-02-16 17:42:41.103 186180 DEBUG nova.network.neutron [req-e647c37d-9bf6-4315-bb9e-3c61bf87ba51 req-cb2a15c3-f59a-4ef8-ac60-57e9edd84694 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:42:41 compute-0 nova_compute[186176]: 2026-02-16 17:42:41.120 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:43 compute-0 nova_compute[186176]: 2026-02-16 17:42:43.125 186180 DEBUG nova.network.neutron [req-e647c37d-9bf6-4315-bb9e-3c61bf87ba51 req-cb2a15c3-f59a-4ef8-ac60-57e9edd84694 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:42:43 compute-0 nova_compute[186176]: 2026-02-16 17:42:43.151 186180 DEBUG oslo_concurrency.lockutils [req-e647c37d-9bf6-4315-bb9e-3c61bf87ba51 req-cb2a15c3-f59a-4ef8-ac60-57e9edd84694 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-b704e6db-26a3-4e50-a981-c2e7f6d427f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:42:43 compute-0 nova_compute[186176]: 2026-02-16 17:42:43.152 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquired lock "refresh_cache-b704e6db-26a3-4e50-a981-c2e7f6d427f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:42:43 compute-0 nova_compute[186176]: 2026-02-16 17:42:43.152 186180 DEBUG nova.network.neutron [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:42:43 compute-0 nova_compute[186176]: 2026-02-16 17:42:43.352 186180 DEBUG nova.network.neutron [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:42:44 compute-0 nova_compute[186176]: 2026-02-16 17:42:44.270 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.163 186180 DEBUG nova.network.neutron [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Updating instance_info_cache with network_info: [{"id": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "address": "fa:16:3e:81:a1:a7", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap749dcde6-e0", "ovs_interfaceid": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.218 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Releasing lock "refresh_cache-b704e6db-26a3-4e50-a981-c2e7f6d427f2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.219 186180 DEBUG nova.compute.manager [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Instance network_info: |[{"id": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "address": "fa:16:3e:81:a1:a7", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap749dcde6-e0", "ovs_interfaceid": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.222 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Start _get_guest_xml network_info=[{"id": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "address": "fa:16:3e:81:a1:a7", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap749dcde6-e0", "ovs_interfaceid": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.227 186180 WARNING nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.234 186180 DEBUG nova.virt.libvirt.host [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.235 186180 DEBUG nova.virt.libvirt.host [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.238 186180 DEBUG nova.virt.libvirt.host [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.239 186180 DEBUG nova.virt.libvirt.host [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.240 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.241 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.241 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.242 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.242 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.242 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.242 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.243 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.243 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.243 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.244 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.244 186180 DEBUG nova.virt.hardware [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.249 186180 DEBUG nova.virt.libvirt.vif [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:42:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2002175474',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2002175474',id=20,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-9fax3nrp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=TagList,task
_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:42:37Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=b704e6db-26a3-4e50-a981-c2e7f6d427f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "address": "fa:16:3e:81:a1:a7", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap749dcde6-e0", "ovs_interfaceid": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.249 186180 DEBUG nova.network.os_vif_util [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "address": "fa:16:3e:81:a1:a7", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap749dcde6-e0", "ovs_interfaceid": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.250 186180 DEBUG nova.network.os_vif_util [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:a1:a7,bridge_name='br-int',has_traffic_filtering=True,id=749dcde6-e031-42ae-bf1f-5aa5824cd2c9,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap749dcde6-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.251 186180 DEBUG nova.objects.instance [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'pci_devices' on Instance uuid b704e6db-26a3-4e50-a981-c2e7f6d427f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.281 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:42:45 compute-0 nova_compute[186176]:   <uuid>b704e6db-26a3-4e50-a981-c2e7f6d427f2</uuid>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   <name>instance-00000014</name>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteStrategies-server-2002175474</nova:name>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:42:45</nova:creationTime>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:42:45 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:42:45 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:42:45 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:42:45 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:42:45 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:42:45 compute-0 nova_compute[186176]:         <nova:user uuid="c54934f49b2044289bcf127662fe114b">tempest-TestExecuteStrategies-1098930400-project-member</nova:user>
Feb 16 17:42:45 compute-0 nova_compute[186176]:         <nova:project uuid="1a237c4b00c5426cb1dc6afe3c7c868c">tempest-TestExecuteStrategies-1098930400</nova:project>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:42:45 compute-0 nova_compute[186176]:         <nova:port uuid="749dcde6-e031-42ae-bf1f-5aa5824cd2c9">
Feb 16 17:42:45 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <system>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <entry name="serial">b704e6db-26a3-4e50-a981-c2e7f6d427f2</entry>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <entry name="uuid">b704e6db-26a3-4e50-a981-c2e7f6d427f2</entry>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     </system>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   <os>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   </os>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   <features>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   </features>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk.config"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:81:a1:a7"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <target dev="tap749dcde6-e0"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/console.log" append="off"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <video>
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     </video>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:42:45 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:42:45 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:42:45 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:42:45 compute-0 nova_compute[186176]: </domain>
Feb 16 17:42:45 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.282 186180 DEBUG nova.compute.manager [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Preparing to wait for external event network-vif-plugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.283 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.284 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.284 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.285 186180 DEBUG nova.virt.libvirt.vif [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:42:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2002175474',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2002175474',id=20,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-9fax3nrp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:42:37Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=b704e6db-26a3-4e50-a981-c2e7f6d427f2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "address": "fa:16:3e:81:a1:a7", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap749dcde6-e0", "ovs_interfaceid": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.286 186180 DEBUG nova.network.os_vif_util [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "address": "fa:16:3e:81:a1:a7", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap749dcde6-e0", "ovs_interfaceid": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.287 186180 DEBUG nova.network.os_vif_util [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:a1:a7,bridge_name='br-int',has_traffic_filtering=True,id=749dcde6-e031-42ae-bf1f-5aa5824cd2c9,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap749dcde6-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.287 186180 DEBUG os_vif [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:a1:a7,bridge_name='br-int',has_traffic_filtering=True,id=749dcde6-e031-42ae-bf1f-5aa5824cd2c9,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap749dcde6-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.288 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.289 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.289 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.295 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.295 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap749dcde6-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.296 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap749dcde6-e0, col_values=(('external_ids', {'iface-id': '749dcde6-e031-42ae-bf1f-5aa5824cd2c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:81:a1:a7', 'vm-uuid': 'b704e6db-26a3-4e50-a981-c2e7f6d427f2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.335 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:45 compute-0 NetworkManager[56463]: <info>  [1771263765.3357] manager: (tap749dcde6-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.339 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.343 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.345 186180 INFO os_vif [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:a1:a7,bridge_name='br-int',has_traffic_filtering=True,id=749dcde6-e031-42ae-bf1f-5aa5824cd2c9,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap749dcde6-e0')
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.350 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.351 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.440 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.442 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.442 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No VIF found with MAC fa:16:3e:81:a1:a7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:42:45 compute-0 nova_compute[186176]: 2026-02-16 17:42:45.443 186180 INFO nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Using config drive
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.180 186180 INFO nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Creating config drive at /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk.config
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.187 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9r_6xho3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.309 186180 DEBUG oslo_concurrency.processutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp9r_6xho3" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:42:46 compute-0 kernel: tap749dcde6-e0: entered promiscuous mode
Feb 16 17:42:46 compute-0 NetworkManager[56463]: <info>  [1771263766.3623] manager: (tap749dcde6-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Feb 16 17:42:46 compute-0 systemd-udevd[213114]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.396 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:46 compute-0 ovn_controller[96437]: 2026-02-16T17:42:46Z|00154|binding|INFO|Claiming lport 749dcde6-e031-42ae-bf1f-5aa5824cd2c9 for this chassis.
Feb 16 17:42:46 compute-0 ovn_controller[96437]: 2026-02-16T17:42:46Z|00155|binding|INFO|749dcde6-e031-42ae-bf1f-5aa5824cd2c9: Claiming fa:16:3e:81:a1:a7 10.100.0.8
Feb 16 17:42:46 compute-0 ovn_controller[96437]: 2026-02-16T17:42:46Z|00156|binding|INFO|Setting lport 749dcde6-e031-42ae-bf1f-5aa5824cd2c9 ovn-installed in OVS
Feb 16 17:42:46 compute-0 ovn_controller[96437]: 2026-02-16T17:42:46Z|00157|binding|INFO|Setting lport 749dcde6-e031-42ae-bf1f-5aa5824cd2c9 up in Southbound
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.407 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:46 compute-0 NetworkManager[56463]: <info>  [1771263766.4103] device (tap749dcde6-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:42:46 compute-0 NetworkManager[56463]: <info>  [1771263766.4109] device (tap749dcde6-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.410 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:a1:a7 10.100.0.8'], port_security=['fa:16:3e:81:a1:a7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b704e6db-26a3-4e50-a981-c2e7f6d427f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=749dcde6-e031-42ae-bf1f-5aa5824cd2c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.411 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 749dcde6-e031-42ae-bf1f-5aa5824cd2c9 in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 bound to our chassis
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.413 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:42:46 compute-0 systemd-machined[155631]: New machine qemu-15-instance-00000014.
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.425 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb7fda8-627e-4b07-9c07-0e5f93b15dc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.426 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap94cafcd0-c1 in ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.428 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap94cafcd0-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.428 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[2e3462ba-d17b-4dab-b070-5948c4edbeed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.430 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[5431d330-62a2-4ab9-bb79-42fe13d061a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000014.
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.443 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[8fbe56db-0589-4487-9eee-c12370a1cc24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.468 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[7f72765d-bc02-4105-99d3-abb5c84cc018]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.496 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[108c9709-a9fa-472e-929e-81d8b0e032d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.501 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[83bacd28-194f-484a-8367-43fc2624d188]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 NetworkManager[56463]: <info>  [1771263766.5031] manager: (tap94cafcd0-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.525 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0f43a8-1323-4e0b-8f99-1413230ea47e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.529 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[768459ff-5d61-4ffc-85a8-689359d06900]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 NetworkManager[56463]: <info>  [1771263766.5493] device (tap94cafcd0-c0): carrier: link connected
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.554 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[469ec031-bd26-4819-a231-b4ec76c41569]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.570 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[338104bd-3bb9-4eec-80ee-498119c20ef7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536207, 'reachable_time': 32285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213150, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.586 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[97327e5a-a01f-490c-9905-7af523e1de82]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536207, 'tstamp': 536207}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213151, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.605 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[69f3ec8c-ca6d-461b-951c-b6a9eb968dcd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536207, 'reachable_time': 32285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213152, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.634 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b679b7c8-4794-48fe-9e41-47da21235f8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.670 186180 DEBUG nova.compute.manager [req-b045a855-a0f0-4471-adc6-68a5eb0e3cff req-4c26896d-9d29-4bdd-a9f7-8fcbd3363468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Received event network-vif-plugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.671 186180 DEBUG oslo_concurrency.lockutils [req-b045a855-a0f0-4471-adc6-68a5eb0e3cff req-4c26896d-9d29-4bdd-a9f7-8fcbd3363468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.671 186180 DEBUG oslo_concurrency.lockutils [req-b045a855-a0f0-4471-adc6-68a5eb0e3cff req-4c26896d-9d29-4bdd-a9f7-8fcbd3363468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.671 186180 DEBUG oslo_concurrency.lockutils [req-b045a855-a0f0-4471-adc6-68a5eb0e3cff req-4c26896d-9d29-4bdd-a9f7-8fcbd3363468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.671 186180 DEBUG nova.compute.manager [req-b045a855-a0f0-4471-adc6-68a5eb0e3cff req-4c26896d-9d29-4bdd-a9f7-8fcbd3363468 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Processing event network-vif-plugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.690 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e083a421-3cca-4c8d-81fa-74ba09cc4bfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.692 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.692 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.693 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94cafcd0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.695 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:46 compute-0 NetworkManager[56463]: <info>  [1771263766.6963] manager: (tap94cafcd0-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Feb 16 17:42:46 compute-0 kernel: tap94cafcd0-c0: entered promiscuous mode
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.698 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.699 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94cafcd0-c0, col_values=(('external_ids', {'iface-id': '5c28d585-b48c-40c6-b5e7-f1e59317b2de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.700 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.701 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:46 compute-0 ovn_controller[96437]: 2026-02-16T17:42:46Z|00158|binding|INFO|Releasing lport 5c28d585-b48c-40c6-b5e7-f1e59317b2de from this chassis (sb_readonly=0)
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.701 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.702 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[c6a8cd33-621f-44c4-920b-7cc259af9d30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.703 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:42:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:46.704 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'env', 'PROCESS_TAG=haproxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.707 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.982 186180 DEBUG nova.compute.manager [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.983 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263766.981312, b704e6db-26a3-4e50-a981-c2e7f6d427f2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.984 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] VM Started (Lifecycle Event)
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.989 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.993 186180 INFO nova.virt.libvirt.driver [-] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Instance spawned successfully.
Feb 16 17:42:46 compute-0 nova_compute[186176]: 2026-02-16 17:42:46.995 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.066 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.074 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.079 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.079 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.080 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.081 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.081 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.082 186180 DEBUG nova.virt.libvirt.driver [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.115 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.116 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263766.9828691, b704e6db-26a3-4e50-a981-c2e7f6d427f2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.116 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] VM Paused (Lifecycle Event)
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.149 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.152 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263766.988925, b704e6db-26a3-4e50-a981-c2e7f6d427f2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.153 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] VM Resumed (Lifecycle Event)
Feb 16 17:42:47 compute-0 podman[213192]: 2026-02-16 17:42:47.161379651 +0000 UTC m=+0.094123013 container create 929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.165 186180 INFO nova.compute.manager [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Took 9.78 seconds to spawn the instance on the hypervisor.
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.165 186180 DEBUG nova.compute.manager [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.178 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.182 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:42:47 compute-0 podman[213192]: 2026-02-16 17:42:47.115313599 +0000 UTC m=+0.048057051 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:42:47 compute-0 systemd[1]: Started libpod-conmon-929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3.scope.
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.214 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:42:47 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:42:47 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c7166f49b29002ffd959ef8ada4ce17b31ed6fa820710b87559578e1c934d2e2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.250 186180 INFO nova.compute.manager [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Took 10.53 seconds to build instance.
Feb 16 17:42:47 compute-0 podman[213192]: 2026-02-16 17:42:47.26365969 +0000 UTC m=+0.196403112 container init 929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:42:47 compute-0 podman[213192]: 2026-02-16 17:42:47.269129572 +0000 UTC m=+0.201872944 container start 929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 16 17:42:47 compute-0 nova_compute[186176]: 2026-02-16 17:42:47.281 186180 DEBUG oslo_concurrency.lockutils [None req-400b5db1-f268-41c0-8b58-31e6f8d7b6be c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:42:47 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213207]: [NOTICE]   (213211) : New worker (213213) forked
Feb 16 17:42:47 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213207]: [NOTICE]   (213211) : Loading success.
Feb 16 17:42:48 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:42:48.028 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.351 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.352 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.352 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.353 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.461 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.536 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.537 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.620 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.777 186180 DEBUG nova.compute.manager [req-9a760f99-f038-46da-be1c-78bdcdba8dfc req-03f71f9d-b3be-44cb-a202-a6af6d24c5a9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Received event network-vif-plugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.778 186180 DEBUG oslo_concurrency.lockutils [req-9a760f99-f038-46da-be1c-78bdcdba8dfc req-03f71f9d-b3be-44cb-a202-a6af6d24c5a9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.779 186180 DEBUG oslo_concurrency.lockutils [req-9a760f99-f038-46da-be1c-78bdcdba8dfc req-03f71f9d-b3be-44cb-a202-a6af6d24c5a9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.780 186180 DEBUG oslo_concurrency.lockutils [req-9a760f99-f038-46da-be1c-78bdcdba8dfc req-03f71f9d-b3be-44cb-a202-a6af6d24c5a9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.780 186180 DEBUG nova.compute.manager [req-9a760f99-f038-46da-be1c-78bdcdba8dfc req-03f71f9d-b3be-44cb-a202-a6af6d24c5a9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] No waiting events found dispatching network-vif-plugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.781 186180 WARNING nova.compute.manager [req-9a760f99-f038-46da-be1c-78bdcdba8dfc req-03f71f9d-b3be-44cb-a202-a6af6d24c5a9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Received unexpected event network-vif-plugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 for instance with vm_state active and task_state None.
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.841 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.844 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5742MB free_disk=73.22290802001953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.845 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.845 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.966 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance b704e6db-26a3-4e50-a981-c2e7f6d427f2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.966 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:42:48 compute-0 nova_compute[186176]: 2026-02-16 17:42:48.967 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:42:49 compute-0 nova_compute[186176]: 2026-02-16 17:42:49.013 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:42:49 compute-0 nova_compute[186176]: 2026-02-16 17:42:49.036 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:42:49 compute-0 nova_compute[186176]: 2026-02-16 17:42:49.075 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:42:49 compute-0 nova_compute[186176]: 2026-02-16 17:42:49.075 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:42:49 compute-0 nova_compute[186176]: 2026-02-16 17:42:49.273 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:50 compute-0 nova_compute[186176]: 2026-02-16 17:42:50.336 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:52 compute-0 nova_compute[186176]: 2026-02-16 17:42:52.071 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:42:52 compute-0 nova_compute[186176]: 2026-02-16 17:42:52.141 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:42:52 compute-0 nova_compute[186176]: 2026-02-16 17:42:52.143 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:42:52 compute-0 nova_compute[186176]: 2026-02-16 17:42:52.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:42:53 compute-0 nova_compute[186176]: 2026-02-16 17:42:53.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:42:54 compute-0 nova_compute[186176]: 2026-02-16 17:42:54.278 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:54 compute-0 nova_compute[186176]: 2026-02-16 17:42:54.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:42:54 compute-0 nova_compute[186176]: 2026-02-16 17:42:54.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:42:55 compute-0 nova_compute[186176]: 2026-02-16 17:42:55.339 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:56 compute-0 podman[213229]: 2026-02-16 17:42:56.125636504 +0000 UTC m=+0.092663218 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, release=1770267347, version=9.7, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public)
Feb 16 17:42:58 compute-0 podman[213260]: 2026-02-16 17:42:58.098905093 +0000 UTC m=+0.066577108 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent)
Feb 16 17:42:58 compute-0 ovn_controller[96437]: 2026-02-16T17:42:58Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:81:a1:a7 10.100.0.8
Feb 16 17:42:58 compute-0 ovn_controller[96437]: 2026-02-16T17:42:58Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:81:a1:a7 10.100.0.8
Feb 16 17:42:59 compute-0 nova_compute[186176]: 2026-02-16 17:42:59.278 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:42:59 compute-0 podman[195505]: time="2026-02-16T17:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:42:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:42:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 16 17:43:00 compute-0 nova_compute[186176]: 2026-02-16 17:43:00.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:43:00 compute-0 nova_compute[186176]: 2026-02-16 17:43:00.342 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:01 compute-0 openstack_network_exporter[198360]: ERROR   17:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:43:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:43:01 compute-0 openstack_network_exporter[198360]: ERROR   17:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:43:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:43:04 compute-0 podman[213282]: 2026-02-16 17:43:04.101985231 +0000 UTC m=+0.063630457 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 17:43:04 compute-0 podman[213281]: 2026-02-16 17:43:04.164952421 +0000 UTC m=+0.134252441 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Feb 16 17:43:04 compute-0 nova_compute[186176]: 2026-02-16 17:43:04.279 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:05 compute-0 nova_compute[186176]: 2026-02-16 17:43:05.373 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:09 compute-0 nova_compute[186176]: 2026-02-16 17:43:09.307 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:10 compute-0 nova_compute[186176]: 2026-02-16 17:43:10.385 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:14 compute-0 nova_compute[186176]: 2026-02-16 17:43:14.310 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:15 compute-0 nova_compute[186176]: 2026-02-16 17:43:15.389 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:16 compute-0 ovn_controller[96437]: 2026-02-16T17:43:16Z|00159|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Feb 16 17:43:19 compute-0 nova_compute[186176]: 2026-02-16 17:43:19.313 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:20 compute-0 nova_compute[186176]: 2026-02-16 17:43:20.393 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:24 compute-0 nova_compute[186176]: 2026-02-16 17:43:24.358 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:25 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 17:43:25 compute-0 nova_compute[186176]: 2026-02-16 17:43:25.432 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:27 compute-0 podman[213333]: 2026-02-16 17:43:27.119577409 +0000 UTC m=+0.088686682 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.openshift.expose-services=, managed_by=edpm_ansible, release=1770267347, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 17:43:29 compute-0 podman[213354]: 2026-02-16 17:43:29.107366579 +0000 UTC m=+0.073793202 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 16 17:43:29 compute-0 nova_compute[186176]: 2026-02-16 17:43:29.361 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:29 compute-0 podman[195505]: time="2026-02-16T17:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:43:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:43:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2644 "" "Go-http-client/1.1"
Feb 16 17:43:30 compute-0 nova_compute[186176]: 2026-02-16 17:43:30.265 186180 DEBUG nova.virt.libvirt.driver [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Creating tmpfile /var/lib/nova/instances/tmp9gzkoxin to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 17:43:30 compute-0 nova_compute[186176]: 2026-02-16 17:43:30.268 186180 DEBUG nova.compute.manager [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9gzkoxin',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 17:43:30 compute-0 nova_compute[186176]: 2026-02-16 17:43:30.437 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:31 compute-0 openstack_network_exporter[198360]: ERROR   17:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:43:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:43:31 compute-0 openstack_network_exporter[198360]: ERROR   17:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:43:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:43:31 compute-0 nova_compute[186176]: 2026-02-16 17:43:31.645 186180 DEBUG nova.compute.manager [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9gzkoxin',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='55df1d91-66af-441a-b872-56282db361ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 17:43:31 compute-0 nova_compute[186176]: 2026-02-16 17:43:31.672 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-55df1d91-66af-441a-b872-56282db361ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:43:31 compute-0 nova_compute[186176]: 2026-02-16 17:43:31.673 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-55df1d91-66af-441a-b872-56282db361ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:43:31 compute-0 nova_compute[186176]: 2026-02-16 17:43:31.674 186180 DEBUG nova.network.neutron [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.552 186180 DEBUG nova.network.neutron [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Updating instance_info_cache with network_info: [{"id": "daae7c28-fd1e-4920-aadf-67a8a8019391", "address": "fa:16:3e:bd:d9:0e", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaae7c28-fd", "ovs_interfaceid": "daae7c28-fd1e-4920-aadf-67a8a8019391", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.575 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-55df1d91-66af-441a-b872-56282db361ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.578 186180 DEBUG nova.virt.libvirt.driver [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9gzkoxin',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='55df1d91-66af-441a-b872-56282db361ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.579 186180 DEBUG nova.virt.libvirt.driver [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Creating instance directory: /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.580 186180 DEBUG nova.virt.libvirt.driver [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Creating disk.info with the contents: {'/var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk': 'qcow2', '/var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.581 186180 DEBUG nova.virt.libvirt.driver [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.581 186180 DEBUG nova.objects.instance [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 55df1d91-66af-441a-b872-56282db361ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.620 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.699 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.700 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.701 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.716 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.782 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.783 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.812 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.814 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.814 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.894 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.895 186180 DEBUG nova.virt.disk.api [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Checking if we can resize image /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.896 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.960 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.962 186180 DEBUG nova.virt.disk.api [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Cannot resize image /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:43:33 compute-0 nova_compute[186176]: 2026-02-16 17:43:33.962 186180 DEBUG nova.objects.instance [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid 55df1d91-66af-441a-b872-56282db361ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.135 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.158 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk.config 485376" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.159 186180 DEBUG nova.virt.libvirt.volume.remotefs [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk.config to /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.160 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk.config /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.361 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.528 186180 DEBUG oslo_concurrency.processutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk.config /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef" returned: 0 in 0.369s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.530 186180 DEBUG nova.virt.libvirt.driver [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.532 186180 DEBUG nova.virt.libvirt.vif [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:42:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-264482010',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-264482010',id=19,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:42:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-1ntz2m5a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:42:23Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=55df1d91-66af-441a-b872-56282db361ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "daae7c28-fd1e-4920-aadf-67a8a8019391", "address": "fa:16:3e:bd:d9:0e", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdaae7c28-fd", "ovs_interfaceid": "daae7c28-fd1e-4920-aadf-67a8a8019391", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.533 186180 DEBUG nova.network.os_vif_util [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "daae7c28-fd1e-4920-aadf-67a8a8019391", "address": "fa:16:3e:bd:d9:0e", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapdaae7c28-fd", "ovs_interfaceid": "daae7c28-fd1e-4920-aadf-67a8a8019391", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.534 186180 DEBUG nova.network.os_vif_util [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bd:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=daae7c28-fd1e-4920-aadf-67a8a8019391,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaae7c28-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.535 186180 DEBUG os_vif [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=daae7c28-fd1e-4920-aadf-67a8a8019391,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaae7c28-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.536 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.537 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.538 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.541 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.542 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdaae7c28-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.542 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdaae7c28-fd, col_values=(('external_ids', {'iface-id': 'daae7c28-fd1e-4920-aadf-67a8a8019391', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:d9:0e', 'vm-uuid': '55df1d91-66af-441a-b872-56282db361ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.544 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:34 compute-0 NetworkManager[56463]: <info>  [1771263814.5458] manager: (tapdaae7c28-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.547 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.552 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.552 186180 INFO os_vif [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=daae7c28-fd1e-4920-aadf-67a8a8019391,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaae7c28-fd')
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.553 186180 DEBUG nova.virt.libvirt.driver [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 17:43:34 compute-0 nova_compute[186176]: 2026-02-16 17:43:34.553 186180 DEBUG nova.compute.manager [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9gzkoxin',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='55df1d91-66af-441a-b872-56282db361ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 17:43:35 compute-0 podman[213397]: 2026-02-16 17:43:35.105400178 +0000 UTC m=+0.070508583 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:43:35 compute-0 podman[213396]: 2026-02-16 17:43:35.147640558 +0000 UTC m=+0.115726185 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb 16 17:43:35 compute-0 nova_compute[186176]: 2026-02-16 17:43:35.378 186180 DEBUG nova.network.neutron [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Port daae7c28-fd1e-4920-aadf-67a8a8019391 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 17:43:35 compute-0 nova_compute[186176]: 2026-02-16 17:43:35.381 186180 DEBUG nova.compute.manager [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9gzkoxin',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='55df1d91-66af-441a-b872-56282db361ef',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 17:43:35 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 17:43:35 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 17:43:35 compute-0 kernel: tapdaae7c28-fd: entered promiscuous mode
Feb 16 17:43:35 compute-0 NetworkManager[56463]: <info>  [1771263815.6955] manager: (tapdaae7c28-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Feb 16 17:43:35 compute-0 ovn_controller[96437]: 2026-02-16T17:43:35Z|00160|binding|INFO|Claiming lport daae7c28-fd1e-4920-aadf-67a8a8019391 for this additional chassis.
Feb 16 17:43:35 compute-0 ovn_controller[96437]: 2026-02-16T17:43:35Z|00161|binding|INFO|daae7c28-fd1e-4920-aadf-67a8a8019391: Claiming fa:16:3e:bd:d9:0e 10.100.0.10
Feb 16 17:43:35 compute-0 nova_compute[186176]: 2026-02-16 17:43:35.697 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:35 compute-0 ovn_controller[96437]: 2026-02-16T17:43:35Z|00162|binding|INFO|Setting lport daae7c28-fd1e-4920-aadf-67a8a8019391 ovn-installed in OVS
Feb 16 17:43:35 compute-0 nova_compute[186176]: 2026-02-16 17:43:35.706 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:35 compute-0 nova_compute[186176]: 2026-02-16 17:43:35.708 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:35 compute-0 nova_compute[186176]: 2026-02-16 17:43:35.709 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:35 compute-0 systemd-machined[155631]: New machine qemu-16-instance-00000013.
Feb 16 17:43:35 compute-0 systemd-udevd[213479]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:43:35 compute-0 NetworkManager[56463]: <info>  [1771263815.7431] device (tapdaae7c28-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:43:35 compute-0 NetworkManager[56463]: <info>  [1771263815.7444] device (tapdaae7c28-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:43:35 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000013.
Feb 16 17:43:36 compute-0 nova_compute[186176]: 2026-02-16 17:43:36.413 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263816.41239, 55df1d91-66af-441a-b872-56282db361ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:43:36 compute-0 nova_compute[186176]: 2026-02-16 17:43:36.414 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 55df1d91-66af-441a-b872-56282db361ef] VM Started (Lifecycle Event)
Feb 16 17:43:36 compute-0 nova_compute[186176]: 2026-02-16 17:43:36.435 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 55df1d91-66af-441a-b872-56282db361ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:43:37 compute-0 nova_compute[186176]: 2026-02-16 17:43:37.090 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263817.0906398, 55df1d91-66af-441a-b872-56282db361ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:43:37 compute-0 nova_compute[186176]: 2026-02-16 17:43:37.091 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 55df1d91-66af-441a-b872-56282db361ef] VM Resumed (Lifecycle Event)
Feb 16 17:43:37 compute-0 nova_compute[186176]: 2026-02-16 17:43:37.116 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 55df1d91-66af-441a-b872-56282db361ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:43:37 compute-0 nova_compute[186176]: 2026-02-16 17:43:37.120 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 55df1d91-66af-441a-b872-56282db361ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:43:37 compute-0 nova_compute[186176]: 2026-02-16 17:43:37.144 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 55df1d91-66af-441a-b872-56282db361ef] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 17:43:37 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:37.698 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:43:37 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:37.699 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:43:37 compute-0 nova_compute[186176]: 2026-02-16 17:43:37.736 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.175 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.177 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.178 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:38 compute-0 ovn_controller[96437]: 2026-02-16T17:43:38Z|00163|binding|INFO|Claiming lport daae7c28-fd1e-4920-aadf-67a8a8019391 for this chassis.
Feb 16 17:43:38 compute-0 ovn_controller[96437]: 2026-02-16T17:43:38Z|00164|binding|INFO|daae7c28-fd1e-4920-aadf-67a8a8019391: Claiming fa:16:3e:bd:d9:0e 10.100.0.10
Feb 16 17:43:38 compute-0 ovn_controller[96437]: 2026-02-16T17:43:38Z|00165|binding|INFO|Setting lport daae7c28-fd1e-4920-aadf-67a8a8019391 up in Southbound
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.203 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:d9:0e 10.100.0.10'], port_security=['fa:16:3e:bd:d9:0e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '55df1d91-66af-441a-b872-56282db361ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '11', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=daae7c28-fd1e-4920-aadf-67a8a8019391) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.206 105730 INFO neutron.agent.ovn.metadata.agent [-] Port daae7c28-fd1e-4920-aadf-67a8a8019391 in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 bound to our chassis
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.209 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.230 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a1c391-60ca-412a-b12c-68b182f34c99]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.267 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[df2d8087-fa62-4c60-adad-a11814e7e89b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.271 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[b55d3536-ddac-4ddd-ae0e-cfef32263646]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.296 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[42107ec8-e190-4ba6-8bc8-8f78764ae6ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.318 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[57bf5dea-5470-4c96-82eb-13d96077cf8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536207, 'reachable_time': 32285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213515, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:38 compute-0 nova_compute[186176]: 2026-02-16 17:43:38.333 186180 INFO nova.compute.manager [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Post operation of migration started
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.334 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b4742871-a9f7-475b-8220-9c8c39767602]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap94cafcd0-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536218, 'tstamp': 536218}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213516, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap94cafcd0-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536221, 'tstamp': 536221}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213516, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.337 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:38 compute-0 nova_compute[186176]: 2026-02-16 17:43:38.339 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:38 compute-0 nova_compute[186176]: 2026-02-16 17:43:38.340 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.341 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94cafcd0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.341 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.342 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94cafcd0-c0, col_values=(('external_ids', {'iface-id': '5c28d585-b48c-40c6-b5e7-f1e59317b2de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.342 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:43:38 compute-0 nova_compute[186176]: 2026-02-16 17:43:38.652 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-55df1d91-66af-441a-b872-56282db361ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:43:38 compute-0 nova_compute[186176]: 2026-02-16 17:43:38.652 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-55df1d91-66af-441a-b872-56282db361ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:43:38 compute-0 nova_compute[186176]: 2026-02-16 17:43:38.653 186180 DEBUG nova.network.neutron [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:43:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:38.702 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:39 compute-0 nova_compute[186176]: 2026-02-16 17:43:39.364 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:39 compute-0 nova_compute[186176]: 2026-02-16 17:43:39.545 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:39 compute-0 nova_compute[186176]: 2026-02-16 17:43:39.707 186180 DEBUG nova.network.neutron [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Updating instance_info_cache with network_info: [{"id": "daae7c28-fd1e-4920-aadf-67a8a8019391", "address": "fa:16:3e:bd:d9:0e", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaae7c28-fd", "ovs_interfaceid": "daae7c28-fd1e-4920-aadf-67a8a8019391", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:43:39 compute-0 nova_compute[186176]: 2026-02-16 17:43:39.727 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-55df1d91-66af-441a-b872-56282db361ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:43:39 compute-0 nova_compute[186176]: 2026-02-16 17:43:39.745 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:39 compute-0 nova_compute[186176]: 2026-02-16 17:43:39.746 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:39 compute-0 nova_compute[186176]: 2026-02-16 17:43:39.746 186180 DEBUG oslo_concurrency.lockutils [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:39 compute-0 nova_compute[186176]: 2026-02-16 17:43:39.754 186180 INFO nova.virt.libvirt.driver [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 17:43:39 compute-0 virtqemud[185389]: Domain id=16 name='instance-00000013' uuid=55df1d91-66af-441a-b872-56282db361ef is tainted: custom-monitor
Feb 16 17:43:40 compute-0 nova_compute[186176]: 2026-02-16 17:43:40.764 186180 INFO nova.virt.libvirt.driver [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 17:43:41 compute-0 nova_compute[186176]: 2026-02-16 17:43:41.772 186180 INFO nova.virt.libvirt.driver [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 17:43:41 compute-0 nova_compute[186176]: 2026-02-16 17:43:41.778 186180 DEBUG nova.compute.manager [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:43:41 compute-0 nova_compute[186176]: 2026-02-16 17:43:41.809 186180 DEBUG nova.objects.instance [None req-c6dcde67-fe7f-48ac-b938-ca6aa575bb4c b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 17:43:44 compute-0 nova_compute[186176]: 2026-02-16 17:43:44.367 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:44 compute-0 nova_compute[186176]: 2026-02-16 17:43:44.548 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.333 186180 DEBUG oslo_concurrency.lockutils [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.334 186180 DEBUG oslo_concurrency.lockutils [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.335 186180 DEBUG oslo_concurrency.lockutils [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.335 186180 DEBUG oslo_concurrency.lockutils [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.336 186180 DEBUG oslo_concurrency.lockutils [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.338 186180 INFO nova.compute.manager [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Terminating instance
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.340 186180 DEBUG nova.compute.manager [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:43:46 compute-0 kernel: tap749dcde6-e0 (unregistering): left promiscuous mode
Feb 16 17:43:46 compute-0 NetworkManager[56463]: <info>  [1771263826.3694] device (tap749dcde6-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.380 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:46 compute-0 ovn_controller[96437]: 2026-02-16T17:43:46Z|00166|binding|INFO|Releasing lport 749dcde6-e031-42ae-bf1f-5aa5824cd2c9 from this chassis (sb_readonly=0)
Feb 16 17:43:46 compute-0 ovn_controller[96437]: 2026-02-16T17:43:46Z|00167|binding|INFO|Setting lport 749dcde6-e031-42ae-bf1f-5aa5824cd2c9 down in Southbound
Feb 16 17:43:46 compute-0 ovn_controller[96437]: 2026-02-16T17:43:46Z|00168|binding|INFO|Removing iface tap749dcde6-e0 ovn-installed in OVS
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.387 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.394 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:81:a1:a7 10.100.0.8'], port_security=['fa:16:3e:81:a1:a7 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'b704e6db-26a3-4e50-a981-c2e7f6d427f2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=749dcde6-e031-42ae-bf1f-5aa5824cd2c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.394 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.396 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 749dcde6-e031-42ae-bf1f-5aa5824cd2c9 in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 unbound from our chassis
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.398 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.414 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[9d6c55dc-5086-45de-8613-849e22b05307]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:46 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000014.scope: Deactivated successfully.
Feb 16 17:43:46 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000014.scope: Consumed 14.422s CPU time.
Feb 16 17:43:46 compute-0 systemd-machined[155631]: Machine qemu-15-instance-00000014 terminated.
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.439 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[731e4846-8e5f-4acd-8d08-63ecdcd8d0fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.443 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc1def0-71ac-48ad-8394-d8f4a258573d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.472 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[806ff51f-0c73-4067-92d8-986b3367b71a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.496 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[80bab12d-d91d-4581-a074-dd2abf8d43ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536207, 'reachable_time': 32285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213534, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.516 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[ba251bbf-8d10-4a07-aa6b-36096bfbe731]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap94cafcd0-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536218, 'tstamp': 536218}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213535, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap94cafcd0-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 536221, 'tstamp': 536221}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213535, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.518 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.520 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.525 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.525 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94cafcd0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.525 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.526 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94cafcd0-c0, col_values=(('external_ids', {'iface-id': '5c28d585-b48c-40c6-b5e7-f1e59317b2de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:46.526 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.608 186180 INFO nova.virt.libvirt.driver [-] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Instance destroyed successfully.
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.609 186180 DEBUG nova.objects.instance [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'resources' on Instance uuid b704e6db-26a3-4e50-a981-c2e7f6d427f2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.622 186180 DEBUG nova.virt.libvirt.vif [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:42:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2002175474',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2002175474',id=20,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:42:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-9fax3nrp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_nam
e='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:42:47Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=b704e6db-26a3-4e50-a981-c2e7f6d427f2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "address": "fa:16:3e:81:a1:a7", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap749dcde6-e0", "ovs_interfaceid": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.622 186180 DEBUG nova.network.os_vif_util [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "address": "fa:16:3e:81:a1:a7", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap749dcde6-e0", "ovs_interfaceid": "749dcde6-e031-42ae-bf1f-5aa5824cd2c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.623 186180 DEBUG nova.network.os_vif_util [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:81:a1:a7,bridge_name='br-int',has_traffic_filtering=True,id=749dcde6-e031-42ae-bf1f-5aa5824cd2c9,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap749dcde6-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.624 186180 DEBUG os_vif [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:a1:a7,bridge_name='br-int',has_traffic_filtering=True,id=749dcde6-e031-42ae-bf1f-5aa5824cd2c9,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap749dcde6-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.627 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.628 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap749dcde6-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.630 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.631 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.631 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.635 186180 INFO os_vif [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:81:a1:a7,bridge_name='br-int',has_traffic_filtering=True,id=749dcde6-e031-42ae-bf1f-5aa5824cd2c9,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap749dcde6-e0')
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.636 186180 INFO nova.virt.libvirt.driver [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Deleting instance files /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2_del
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.637 186180 INFO nova.virt.libvirt.driver [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Deletion of /var/lib/nova/instances/b704e6db-26a3-4e50-a981-c2e7f6d427f2_del complete
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.686 186180 INFO nova.compute.manager [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.687 186180 DEBUG oslo.service.loopingcall [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.687 186180 DEBUG nova.compute.manager [-] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:43:46 compute-0 nova_compute[186176]: 2026-02-16 17:43:46.688 186180 DEBUG nova.network.neutron [-] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:43:47 compute-0 nova_compute[186176]: 2026-02-16 17:43:47.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:43:47 compute-0 nova_compute[186176]: 2026-02-16 17:43:47.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:43:47 compute-0 nova_compute[186176]: 2026-02-16 17:43:47.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:43:47 compute-0 nova_compute[186176]: 2026-02-16 17:43:47.352 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Feb 16 17:43:47 compute-0 nova_compute[186176]: 2026-02-16 17:43:47.435 186180 DEBUG nova.compute.manager [req-01f98b6e-f40b-4c1e-9b3a-97cfe36d627a req-db199f77-48f7-4e07-a72d-0f5cf3c6bd4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Received event network-vif-unplugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:43:47 compute-0 nova_compute[186176]: 2026-02-16 17:43:47.436 186180 DEBUG oslo_concurrency.lockutils [req-01f98b6e-f40b-4c1e-9b3a-97cfe36d627a req-db199f77-48f7-4e07-a72d-0f5cf3c6bd4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:47 compute-0 nova_compute[186176]: 2026-02-16 17:43:47.437 186180 DEBUG oslo_concurrency.lockutils [req-01f98b6e-f40b-4c1e-9b3a-97cfe36d627a req-db199f77-48f7-4e07-a72d-0f5cf3c6bd4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:47 compute-0 nova_compute[186176]: 2026-02-16 17:43:47.437 186180 DEBUG oslo_concurrency.lockutils [req-01f98b6e-f40b-4c1e-9b3a-97cfe36d627a req-db199f77-48f7-4e07-a72d-0f5cf3c6bd4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:47 compute-0 nova_compute[186176]: 2026-02-16 17:43:47.437 186180 DEBUG nova.compute.manager [req-01f98b6e-f40b-4c1e-9b3a-97cfe36d627a req-db199f77-48f7-4e07-a72d-0f5cf3c6bd4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] No waiting events found dispatching network-vif-unplugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:43:47 compute-0 nova_compute[186176]: 2026-02-16 17:43:47.437 186180 DEBUG nova.compute.manager [req-01f98b6e-f40b-4c1e-9b3a-97cfe36d627a req-db199f77-48f7-4e07-a72d-0f5cf3c6bd4d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Received event network-vif-unplugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.204 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-55df1d91-66af-441a-b872-56282db361ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.205 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-55df1d91-66af-441a-b872-56282db361ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.205 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 55df1d91-66af-441a-b872-56282db361ef] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.205 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 55df1d91-66af-441a-b872-56282db361ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.582 186180 DEBUG nova.network.neutron [-] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.613 186180 INFO nova.compute.manager [-] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Took 1.93 seconds to deallocate network for instance.
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.685 186180 DEBUG oslo_concurrency.lockutils [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.686 186180 DEBUG oslo_concurrency.lockutils [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.764 186180 DEBUG nova.compute.provider_tree [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.783 186180 DEBUG nova.scheduler.client.report [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.824 186180 DEBUG oslo_concurrency.lockutils [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.868 186180 INFO nova.scheduler.client.report [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Deleted allocations for instance b704e6db-26a3-4e50-a981-c2e7f6d427f2
Feb 16 17:43:48 compute-0 nova_compute[186176]: 2026-02-16 17:43:48.955 186180 DEBUG oslo_concurrency.lockutils [None req-b7328f28-30fa-4535-bdce-159eb2739897 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.369 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.407 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 55df1d91-66af-441a-b872-56282db361ef] Updating instance_info_cache with network_info: [{"id": "daae7c28-fd1e-4920-aadf-67a8a8019391", "address": "fa:16:3e:bd:d9:0e", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaae7c28-fd", "ovs_interfaceid": "daae7c28-fd1e-4920-aadf-67a8a8019391", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.434 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-55df1d91-66af-441a-b872-56282db361ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.434 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 55df1d91-66af-441a-b872-56282db361ef] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.435 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.465 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.466 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.466 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.467 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.479 186180 DEBUG oslo_concurrency.lockutils [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "55df1d91-66af-441a-b872-56282db361ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.480 186180 DEBUG oslo_concurrency.lockutils [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "55df1d91-66af-441a-b872-56282db361ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.480 186180 DEBUG oslo_concurrency.lockutils [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "55df1d91-66af-441a-b872-56282db361ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.481 186180 DEBUG oslo_concurrency.lockutils [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "55df1d91-66af-441a-b872-56282db361ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.481 186180 DEBUG oslo_concurrency.lockutils [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "55df1d91-66af-441a-b872-56282db361ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.483 186180 INFO nova.compute.manager [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Terminating instance
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.485 186180 DEBUG nova.compute.manager [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:43:49 compute-0 kernel: tapdaae7c28-fd (unregistering): left promiscuous mode
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.510 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:49 compute-0 NetworkManager[56463]: <info>  [1771263829.5120] device (tapdaae7c28-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:43:49 compute-0 ovn_controller[96437]: 2026-02-16T17:43:49Z|00169|binding|INFO|Releasing lport daae7c28-fd1e-4920-aadf-67a8a8019391 from this chassis (sb_readonly=0)
Feb 16 17:43:49 compute-0 ovn_controller[96437]: 2026-02-16T17:43:49Z|00170|binding|INFO|Setting lport daae7c28-fd1e-4920-aadf-67a8a8019391 down in Southbound
Feb 16 17:43:49 compute-0 ovn_controller[96437]: 2026-02-16T17:43:49Z|00171|binding|INFO|Removing iface tapdaae7c28-fd ovn-installed in OVS
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.519 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.525 186180 DEBUG nova.compute.manager [req-3c431c16-5c02-4dba-9434-d1b95b04abc5 req-ee6cc282-f3ff-4fe0-a875-af3dfdff2cd5 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Received event network-vif-plugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.526 186180 DEBUG oslo_concurrency.lockutils [req-3c431c16-5c02-4dba-9434-d1b95b04abc5 req-ee6cc282-f3ff-4fe0-a875-af3dfdff2cd5 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.526 186180 DEBUG oslo_concurrency.lockutils [req-3c431c16-5c02-4dba-9434-d1b95b04abc5 req-ee6cc282-f3ff-4fe0-a875-af3dfdff2cd5 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.527 186180 DEBUG oslo_concurrency.lockutils [req-3c431c16-5c02-4dba-9434-d1b95b04abc5 req-ee6cc282-f3ff-4fe0-a875-af3dfdff2cd5 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "b704e6db-26a3-4e50-a981-c2e7f6d427f2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.527 186180 DEBUG nova.compute.manager [req-3c431c16-5c02-4dba-9434-d1b95b04abc5 req-ee6cc282-f3ff-4fe0-a875-af3dfdff2cd5 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] No waiting events found dispatching network-vif-plugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.527 186180 WARNING nova.compute.manager [req-3c431c16-5c02-4dba-9434-d1b95b04abc5 req-ee6cc282-f3ff-4fe0-a875-af3dfdff2cd5 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Received unexpected event network-vif-plugged-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 for instance with vm_state deleted and task_state None.
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.528 186180 DEBUG nova.compute.manager [req-3c431c16-5c02-4dba-9434-d1b95b04abc5 req-ee6cc282-f3ff-4fe0-a875-af3dfdff2cd5 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Received event network-vif-deleted-749dcde6-e031-42ae-bf1f-5aa5824cd2c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.528 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.548 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:d9:0e 10.100.0.10'], port_security=['fa:16:3e:bd:d9:0e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '55df1d91-66af-441a-b872-56282db361ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '13', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=daae7c28-fd1e-4920-aadf-67a8a8019391) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.550 105730 INFO neutron.agent.ovn.metadata.agent [-] Port daae7c28-fd1e-4920-aadf-67a8a8019391 in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 unbound from our chassis
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.551 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.552 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[46f51868-7ec6-429b-82eb-2dc7f977bfa9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.553 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace which is not needed anymore
Feb 16 17:43:49 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000013.scope: Deactivated successfully.
Feb 16 17:43:49 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000013.scope: Consumed 1.656s CPU time.
Feb 16 17:43:49 compute-0 systemd-machined[155631]: Machine qemu-16-instance-00000013 terminated.
Feb 16 17:43:49 compute-0 NetworkManager[56463]: <info>  [1771263829.7050] manager: (tapdaae7c28-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.706 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.710 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:49 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213207]: [NOTICE]   (213211) : haproxy version is 2.8.14-c23fe91
Feb 16 17:43:49 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213207]: [NOTICE]   (213211) : path to executable is /usr/sbin/haproxy
Feb 16 17:43:49 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213207]: [WARNING]  (213211) : Exiting Master process...
Feb 16 17:43:49 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213207]: [ALERT]    (213211) : Current worker (213213) exited with code 143 (Terminated)
Feb 16 17:43:49 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213207]: [WARNING]  (213211) : All workers exited. Exiting... (0)
Feb 16 17:43:49 compute-0 systemd[1]: libpod-929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3.scope: Deactivated successfully.
Feb 16 17:43:49 compute-0 podman[213578]: 2026-02-16 17:43:49.723020019 +0000 UTC m=+0.053467232 container died 929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.746 186180 INFO nova.virt.libvirt.driver [-] [instance: 55df1d91-66af-441a-b872-56282db361ef] Instance destroyed successfully.
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.749 186180 DEBUG nova.objects.instance [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'resources' on Instance uuid 55df1d91-66af-441a-b872-56282db361ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:43:49 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3-userdata-shm.mount: Deactivated successfully.
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.759 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:43:49 compute-0 systemd[1]: var-lib-containers-storage-overlay-c7166f49b29002ffd959ef8ada4ce17b31ed6fa820710b87559578e1c934d2e2-merged.mount: Deactivated successfully.
Feb 16 17:43:49 compute-0 podman[213578]: 2026-02-16 17:43:49.770193267 +0000 UTC m=+0.100640480 container cleanup 929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.779 186180 DEBUG nova.virt.libvirt.vif [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T17:42:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-264482010',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-264482010',id=19,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:42:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-1ntz2m5a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:43:41Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=55df1d91-66af-441a-b872-56282db361ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "daae7c28-fd1e-4920-aadf-67a8a8019391", "address": "fa:16:3e:bd:d9:0e", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaae7c28-fd", "ovs_interfaceid": "daae7c28-fd1e-4920-aadf-67a8a8019391", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.780 186180 DEBUG nova.network.os_vif_util [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "daae7c28-fd1e-4920-aadf-67a8a8019391", "address": "fa:16:3e:bd:d9:0e", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdaae7c28-fd", "ovs_interfaceid": "daae7c28-fd1e-4920-aadf-67a8a8019391", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.781 186180 DEBUG nova.network.os_vif_util [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:bd:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=daae7c28-fd1e-4920-aadf-67a8a8019391,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaae7c28-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.782 186180 DEBUG os_vif [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=daae7c28-fd1e-4920-aadf-67a8a8019391,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaae7c28-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.784 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.785 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdaae7c28-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:49 compute-0 systemd[1]: libpod-conmon-929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3.scope: Deactivated successfully.
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.787 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.788 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.791 186180 INFO os_vif [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:bd:d9:0e,bridge_name='br-int',has_traffic_filtering=True,id=daae7c28-fd1e-4920-aadf-67a8a8019391,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdaae7c28-fd')
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.792 186180 INFO nova.virt.libvirt.driver [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Deleting instance files /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef_del
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.793 186180 INFO nova.virt.libvirt.driver [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Deletion of /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef_del complete
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.825 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk --force-share --output=json" returned: 1 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.826 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] '/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk --force-share --output=json' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.826 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000013, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/55df1d91-66af-441a-b872-56282db361ef/disk
Feb 16 17:43:49 compute-0 podman[213622]: 2026-02-16 17:43:49.847120114 +0000 UTC m=+0.054426535 container remove 929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.854 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab12179-9b61-4a6f-ae29-30bb678a349f]: (4, ('Mon Feb 16 05:43:49 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3)\n929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3\nMon Feb 16 05:43:49 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3)\n929da0ea7c9983dc188f1b934feafd647a9271b2e395a4992e5df4c76333dcd3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.856 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[81fcfdc1-e6fa-4573-8700-31728ad3d79a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.857 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.859 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:49 compute-0 kernel: tap94cafcd0-c0: left promiscuous mode
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.868 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.871 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[ec047f32-4ca9-49d0-8de3-df6416d6200f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.878 186180 INFO nova.compute.manager [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Took 0.39 seconds to destroy the instance on the hypervisor.
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.878 186180 DEBUG oslo.service.loopingcall [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.879 186180 DEBUG nova.compute.manager [-] [instance: 55df1d91-66af-441a-b872-56282db361ef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.879 186180 DEBUG nova.network.neutron [-] [instance: 55df1d91-66af-441a-b872-56282db361ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.890 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[1d9b2f25-9421-4363-a11c-34fa806af712]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.892 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[d440b0f8-8ac3-456a-aee2-113186dc6411]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.903 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[092172b6-5b41-4dd2-b457-7cbf592eab6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 536201, 'reachable_time': 27073, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213639, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.907 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:43:49 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:43:49.907 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[639d9b57-8a65-4f8c-82b9-11da858f3ab9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:43:49 compute-0 systemd[1]: run-netns-ovnmeta\x2d94cafcd0\x2dc7c2\x2d48b4\x2da2dd\x2d21c16ce48dc4.mount: Deactivated successfully.
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.981 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.982 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5699MB free_disk=73.19491958618164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.983 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:49 compute-0 nova_compute[186176]: 2026-02-16 17:43:49.983 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.125 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance 55df1d91-66af-441a-b872-56282db361ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.126 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.126 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.169 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.227 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.354 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.355 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.463 186180 DEBUG nova.compute.manager [req-9d8b9c34-24ac-4635-9a60-090bf1dc9f77 req-340da64f-b7a9-4409-a161-82f916b4e798 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Received event network-vif-unplugged-daae7c28-fd1e-4920-aadf-67a8a8019391 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.463 186180 DEBUG oslo_concurrency.lockutils [req-9d8b9c34-24ac-4635-9a60-090bf1dc9f77 req-340da64f-b7a9-4409-a161-82f916b4e798 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "55df1d91-66af-441a-b872-56282db361ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.464 186180 DEBUG oslo_concurrency.lockutils [req-9d8b9c34-24ac-4635-9a60-090bf1dc9f77 req-340da64f-b7a9-4409-a161-82f916b4e798 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "55df1d91-66af-441a-b872-56282db361ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.464 186180 DEBUG oslo_concurrency.lockutils [req-9d8b9c34-24ac-4635-9a60-090bf1dc9f77 req-340da64f-b7a9-4409-a161-82f916b4e798 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "55df1d91-66af-441a-b872-56282db361ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.465 186180 DEBUG nova.compute.manager [req-9d8b9c34-24ac-4635-9a60-090bf1dc9f77 req-340da64f-b7a9-4409-a161-82f916b4e798 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] No waiting events found dispatching network-vif-unplugged-daae7c28-fd1e-4920-aadf-67a8a8019391 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:43:50 compute-0 nova_compute[186176]: 2026-02-16 17:43:50.465 186180 DEBUG nova.compute.manager [req-9d8b9c34-24ac-4635-9a60-090bf1dc9f77 req-340da64f-b7a9-4409-a161-82f916b4e798 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Received event network-vif-unplugged-daae7c28-fd1e-4920-aadf-67a8a8019391 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:43:51 compute-0 nova_compute[186176]: 2026-02-16 17:43:51.208 186180 DEBUG nova.network.neutron [-] [instance: 55df1d91-66af-441a-b872-56282db361ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:43:51 compute-0 nova_compute[186176]: 2026-02-16 17:43:51.228 186180 INFO nova.compute.manager [-] [instance: 55df1d91-66af-441a-b872-56282db361ef] Took 1.35 seconds to deallocate network for instance.
Feb 16 17:43:51 compute-0 nova_compute[186176]: 2026-02-16 17:43:51.236 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:43:51 compute-0 nova_compute[186176]: 2026-02-16 17:43:51.268 186180 DEBUG oslo_concurrency.lockutils [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:51 compute-0 nova_compute[186176]: 2026-02-16 17:43:51.269 186180 DEBUG oslo_concurrency.lockutils [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:51 compute-0 nova_compute[186176]: 2026-02-16 17:43:51.323 186180 DEBUG nova.compute.provider_tree [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:43:51 compute-0 nova_compute[186176]: 2026-02-16 17:43:51.346 186180 DEBUG nova.scheduler.client.report [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:43:51 compute-0 nova_compute[186176]: 2026-02-16 17:43:51.378 186180 DEBUG oslo_concurrency.lockutils [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:51 compute-0 nova_compute[186176]: 2026-02-16 17:43:51.398 186180 INFO nova.scheduler.client.report [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Deleted allocations for instance 55df1d91-66af-441a-b872-56282db361ef
Feb 16 17:43:51 compute-0 nova_compute[186176]: 2026-02-16 17:43:51.457 186180 DEBUG oslo_concurrency.lockutils [None req-81b9075b-57c1-4590-9e74-43d84ffeb892 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "55df1d91-66af-441a-b872-56282db361ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.977s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:51 compute-0 nova_compute[186176]: 2026-02-16 17:43:51.605 186180 DEBUG nova.compute.manager [req-f1f43aab-edb1-4c54-a110-3e1370a180b8 req-459f5e6d-4172-4ce6-898e-9be8ad977e53 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Received event network-vif-deleted-daae7c28-fd1e-4920-aadf-67a8a8019391 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:43:52 compute-0 nova_compute[186176]: 2026-02-16 17:43:52.558 186180 DEBUG nova.compute.manager [req-7b139927-823e-485f-9a5f-2180cc3d6767 req-d7561223-00ab-41d3-a85b-89867d8093d3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Received event network-vif-plugged-daae7c28-fd1e-4920-aadf-67a8a8019391 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:43:52 compute-0 nova_compute[186176]: 2026-02-16 17:43:52.559 186180 DEBUG oslo_concurrency.lockutils [req-7b139927-823e-485f-9a5f-2180cc3d6767 req-d7561223-00ab-41d3-a85b-89867d8093d3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "55df1d91-66af-441a-b872-56282db361ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:43:52 compute-0 nova_compute[186176]: 2026-02-16 17:43:52.559 186180 DEBUG oslo_concurrency.lockutils [req-7b139927-823e-485f-9a5f-2180cc3d6767 req-d7561223-00ab-41d3-a85b-89867d8093d3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "55df1d91-66af-441a-b872-56282db361ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:43:52 compute-0 nova_compute[186176]: 2026-02-16 17:43:52.559 186180 DEBUG oslo_concurrency.lockutils [req-7b139927-823e-485f-9a5f-2180cc3d6767 req-d7561223-00ab-41d3-a85b-89867d8093d3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "55df1d91-66af-441a-b872-56282db361ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:43:52 compute-0 nova_compute[186176]: 2026-02-16 17:43:52.559 186180 DEBUG nova.compute.manager [req-7b139927-823e-485f-9a5f-2180cc3d6767 req-d7561223-00ab-41d3-a85b-89867d8093d3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] No waiting events found dispatching network-vif-plugged-daae7c28-fd1e-4920-aadf-67a8a8019391 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:43:52 compute-0 nova_compute[186176]: 2026-02-16 17:43:52.560 186180 WARNING nova.compute.manager [req-7b139927-823e-485f-9a5f-2180cc3d6767 req-d7561223-00ab-41d3-a85b-89867d8093d3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 55df1d91-66af-441a-b872-56282db361ef] Received unexpected event network-vif-plugged-daae7c28-fd1e-4920-aadf-67a8a8019391 for instance with vm_state deleted and task_state None.
Feb 16 17:43:53 compute-0 nova_compute[186176]: 2026-02-16 17:43:53.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:43:53 compute-0 nova_compute[186176]: 2026-02-16 17:43:53.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:43:53 compute-0 nova_compute[186176]: 2026-02-16 17:43:53.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:43:53 compute-0 nova_compute[186176]: 2026-02-16 17:43:53.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:43:54 compute-0 nova_compute[186176]: 2026-02-16 17:43:54.423 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:54 compute-0 nova_compute[186176]: 2026-02-16 17:43:54.787 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:55 compute-0 nova_compute[186176]: 2026-02-16 17:43:55.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:43:56 compute-0 nova_compute[186176]: 2026-02-16 17:43:56.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:43:58 compute-0 podman[213641]: 2026-02-16 17:43:58.118713157 +0000 UTC m=+0.085606198 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, release=1770267347, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z)
Feb 16 17:43:59 compute-0 nova_compute[186176]: 2026-02-16 17:43:59.425 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:43:59 compute-0 podman[195505]: time="2026-02-16T17:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:43:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:43:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 17:43:59 compute-0 nova_compute[186176]: 2026-02-16 17:43:59.789 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:00 compute-0 podman[213665]: 2026-02-16 17:44:00.115171186 +0000 UTC m=+0.074819437 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:44:01 compute-0 openstack_network_exporter[198360]: ERROR   17:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:44:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:44:01 compute-0 openstack_network_exporter[198360]: ERROR   17:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:44:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:44:01 compute-0 nova_compute[186176]: 2026-02-16 17:44:01.607 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771263826.6055355, b704e6db-26a3-4e50-a981-c2e7f6d427f2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:44:01 compute-0 nova_compute[186176]: 2026-02-16 17:44:01.610 186180 INFO nova.compute.manager [-] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] VM Stopped (Lifecycle Event)
Feb 16 17:44:01 compute-0 nova_compute[186176]: 2026-02-16 17:44:01.630 186180 DEBUG nova.compute.manager [None req-af27b921-b35a-49a8-be9c-dae0ac659fc2 - - - - - -] [instance: b704e6db-26a3-4e50-a981-c2e7f6d427f2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:44:02 compute-0 nova_compute[186176]: 2026-02-16 17:44:02.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:44:04 compute-0 nova_compute[186176]: 2026-02-16 17:44:04.428 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:04 compute-0 nova_compute[186176]: 2026-02-16 17:44:04.743 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771263829.7413354, 55df1d91-66af-441a-b872-56282db361ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:44:04 compute-0 nova_compute[186176]: 2026-02-16 17:44:04.743 186180 INFO nova.compute.manager [-] [instance: 55df1d91-66af-441a-b872-56282db361ef] VM Stopped (Lifecycle Event)
Feb 16 17:44:04 compute-0 nova_compute[186176]: 2026-02-16 17:44:04.765 186180 DEBUG nova.compute.manager [None req-3a152bf5-5083-4826-8800-e6d2c3f40d23 - - - - - -] [instance: 55df1d91-66af-441a-b872-56282db361ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:44:04 compute-0 nova_compute[186176]: 2026-02-16 17:44:04.792 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:06 compute-0 podman[213686]: 2026-02-16 17:44:06.109306187 +0000 UTC m=+0.067883639 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:44:06 compute-0 podman[213685]: 2026-02-16 17:44:06.171911059 +0000 UTC m=+0.133439512 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 17:44:09 compute-0 nova_compute[186176]: 2026-02-16 17:44:09.432 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:09 compute-0 nova_compute[186176]: 2026-02-16 17:44:09.794 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:14 compute-0 nova_compute[186176]: 2026-02-16 17:44:14.465 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:14 compute-0 nova_compute[186176]: 2026-02-16 17:44:14.796 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:19 compute-0 nova_compute[186176]: 2026-02-16 17:44:19.468 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:19 compute-0 nova_compute[186176]: 2026-02-16 17:44:19.798 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:24 compute-0 nova_compute[186176]: 2026-02-16 17:44:24.471 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:24 compute-0 nova_compute[186176]: 2026-02-16 17:44:24.801 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:29 compute-0 podman[213735]: 2026-02-16 17:44:29.114377307 +0000 UTC m=+0.083838056 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9/ubi-minimal, release=1770267347, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, version=9.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7)
Feb 16 17:44:29 compute-0 nova_compute[186176]: 2026-02-16 17:44:29.474 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:29 compute-0 podman[195505]: time="2026-02-16T17:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:44:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:44:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 17:44:29 compute-0 nova_compute[186176]: 2026-02-16 17:44:29.803 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:31 compute-0 podman[213758]: 2026-02-16 17:44:31.113468193 +0000 UTC m=+0.075011319 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:44:31 compute-0 openstack_network_exporter[198360]: ERROR   17:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:44:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:44:31 compute-0 openstack_network_exporter[198360]: ERROR   17:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:44:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:44:32 compute-0 nova_compute[186176]: 2026-02-16 17:44:32.266 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:44:32 compute-0 nova_compute[186176]: 2026-02-16 17:44:32.267 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:44:32 compute-0 nova_compute[186176]: 2026-02-16 17:44:32.290 186180 DEBUG nova.compute.manager [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:44:32 compute-0 nova_compute[186176]: 2026-02-16 17:44:32.396 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:44:32 compute-0 nova_compute[186176]: 2026-02-16 17:44:32.396 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:44:32 compute-0 nova_compute[186176]: 2026-02-16 17:44:32.408 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:44:32 compute-0 nova_compute[186176]: 2026-02-16 17:44:32.408 186180 INFO nova.compute.claims [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:44:32 compute-0 nova_compute[186176]: 2026-02-16 17:44:32.701 186180 DEBUG nova.compute.provider_tree [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:44:32 compute-0 nova_compute[186176]: 2026-02-16 17:44:32.725 186180 DEBUG nova.scheduler.client.report [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.004 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.005 186180 DEBUG nova.compute.manager [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.098 186180 DEBUG nova.compute.manager [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.099 186180 DEBUG nova.network.neutron [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.142 186180 INFO nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.178 186180 DEBUG nova.compute.manager [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.329 186180 DEBUG nova.compute.manager [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.330 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.330 186180 INFO nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Creating image(s)
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.330 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "/var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.331 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.331 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.347 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.431 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.432 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.433 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.451 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.512 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.513 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.540 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.542 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.543 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.626 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.628 186180 DEBUG nova.virt.disk.api [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Checking if we can resize image /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.629 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.692 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.693 186180 DEBUG nova.virt.disk.api [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Cannot resize image /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.694 186180 DEBUG nova.objects.instance [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'migration_context' on Instance uuid 8fd003a2-24e8-4868-8c36-8795ad9aefd7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:44:33 compute-0 nova_compute[186176]: 2026-02-16 17:44:33.956 186180 DEBUG nova.policy [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54934f49b2044289bcf127662fe114b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:44:34 compute-0 nova_compute[186176]: 2026-02-16 17:44:34.476 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:34 compute-0 nova_compute[186176]: 2026-02-16 17:44:34.805 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:35 compute-0 nova_compute[186176]: 2026-02-16 17:44:35.287 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:44:35 compute-0 nova_compute[186176]: 2026-02-16 17:44:35.288 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Ensure instance console log exists: /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:44:35 compute-0 nova_compute[186176]: 2026-02-16 17:44:35.289 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:44:35 compute-0 nova_compute[186176]: 2026-02-16 17:44:35.290 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:44:35 compute-0 nova_compute[186176]: 2026-02-16 17:44:35.290 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:44:35 compute-0 nova_compute[186176]: 2026-02-16 17:44:35.790 186180 DEBUG nova.network.neutron [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Successfully created port: ea4ec714-57e6-4a38-9844-3dde4e2e085c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:44:37 compute-0 nova_compute[186176]: 2026-02-16 17:44:37.017 186180 DEBUG nova.network.neutron [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Successfully updated port: ea4ec714-57e6-4a38-9844-3dde4e2e085c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:44:37 compute-0 nova_compute[186176]: 2026-02-16 17:44:37.053 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:44:37 compute-0 nova_compute[186176]: 2026-02-16 17:44:37.053 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquired lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:44:37 compute-0 nova_compute[186176]: 2026-02-16 17:44:37.054 186180 DEBUG nova.network.neutron [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:44:37 compute-0 podman[213794]: 2026-02-16 17:44:37.120013379 +0000 UTC m=+0.076619769 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:44:37 compute-0 podman[213793]: 2026-02-16 17:44:37.154682989 +0000 UTC m=+0.118715651 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Feb 16 17:44:37 compute-0 nova_compute[186176]: 2026-02-16 17:44:37.183 186180 DEBUG nova.compute.manager [req-20675996-f8e6-49a1-8b52-41a43b7c8642 req-6b990ba5-8e0a-4387-a82f-e3965fa4fe68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-changed-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:44:37 compute-0 nova_compute[186176]: 2026-02-16 17:44:37.184 186180 DEBUG nova.compute.manager [req-20675996-f8e6-49a1-8b52-41a43b7c8642 req-6b990ba5-8e0a-4387-a82f-e3965fa4fe68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Refreshing instance network info cache due to event network-changed-ea4ec714-57e6-4a38-9844-3dde4e2e085c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:44:37 compute-0 nova_compute[186176]: 2026-02-16 17:44:37.184 186180 DEBUG oslo_concurrency.lockutils [req-20675996-f8e6-49a1-8b52-41a43b7c8642 req-6b990ba5-8e0a-4387-a82f-e3965fa4fe68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:44:37 compute-0 nova_compute[186176]: 2026-02-16 17:44:37.862 186180 DEBUG nova.network.neutron [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:44:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:38.176 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:44:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:38.177 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:44:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:38.177 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:44:38 compute-0 nova_compute[186176]: 2026-02-16 17:44:38.914 186180 DEBUG nova.network.neutron [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Updating instance_info_cache with network_info: [{"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.452 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Releasing lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.452 186180 DEBUG nova.compute.manager [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Instance network_info: |[{"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.453 186180 DEBUG oslo_concurrency.lockutils [req-20675996-f8e6-49a1-8b52-41a43b7c8642 req-6b990ba5-8e0a-4387-a82f-e3965fa4fe68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.454 186180 DEBUG nova.network.neutron [req-20675996-f8e6-49a1-8b52-41a43b7c8642 req-6b990ba5-8e0a-4387-a82f-e3965fa4fe68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Refreshing network info cache for port ea4ec714-57e6-4a38-9844-3dde4e2e085c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.459 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Start _get_guest_xml network_info=[{"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.466 186180 WARNING nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.473 186180 DEBUG nova.virt.libvirt.host [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.474 186180 DEBUG nova.virt.libvirt.host [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.483 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.487 186180 DEBUG nova.virt.libvirt.host [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.488 186180 DEBUG nova.virt.libvirt.host [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.489 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.490 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.491 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.491 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.492 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.492 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.492 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.493 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.493 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.494 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.494 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.494 186180 DEBUG nova.virt.hardware [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.501 186180 DEBUG nova.virt.libvirt.vif [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1847915730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1847915730',id=22,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-yb2z0gbq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:44:33Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=8fd003a2-24e8-4868-8c36-8795ad9aefd7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.502 186180 DEBUG nova.network.os_vif_util [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.503 186180 DEBUG nova.network.os_vif_util [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:4e:f2,bridge_name='br-int',has_traffic_filtering=True,id=ea4ec714-57e6-4a38-9844-3dde4e2e085c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea4ec714-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.504 186180 DEBUG nova.objects.instance [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'pci_devices' on Instance uuid 8fd003a2-24e8-4868-8c36-8795ad9aefd7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.520 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:44:39 compute-0 nova_compute[186176]:   <uuid>8fd003a2-24e8-4868-8c36-8795ad9aefd7</uuid>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   <name>instance-00000016</name>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteStrategies-server-1847915730</nova:name>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:44:39</nova:creationTime>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:44:39 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:44:39 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:44:39 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:44:39 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:44:39 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:44:39 compute-0 nova_compute[186176]:         <nova:user uuid="c54934f49b2044289bcf127662fe114b">tempest-TestExecuteStrategies-1098930400-project-member</nova:user>
Feb 16 17:44:39 compute-0 nova_compute[186176]:         <nova:project uuid="1a237c4b00c5426cb1dc6afe3c7c868c">tempest-TestExecuteStrategies-1098930400</nova:project>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:44:39 compute-0 nova_compute[186176]:         <nova:port uuid="ea4ec714-57e6-4a38-9844-3dde4e2e085c">
Feb 16 17:44:39 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <system>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <entry name="serial">8fd003a2-24e8-4868-8c36-8795ad9aefd7</entry>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <entry name="uuid">8fd003a2-24e8-4868-8c36-8795ad9aefd7</entry>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     </system>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   <os>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   </os>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   <features>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   </features>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk.config"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:a4:4e:f2"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <target dev="tapea4ec714-57"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/console.log" append="off"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <video>
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     </video>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:44:39 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:44:39 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:44:39 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:44:39 compute-0 nova_compute[186176]: </domain>
Feb 16 17:44:39 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.522 186180 DEBUG nova.compute.manager [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Preparing to wait for external event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.522 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.523 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.523 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.524 186180 DEBUG nova.virt.libvirt.vif [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1847915730',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1847915730',id=22,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-yb2z0gbq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=Ta
gList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:44:33Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=8fd003a2-24e8-4868-8c36-8795ad9aefd7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.525 186180 DEBUG nova.network.os_vif_util [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.526 186180 DEBUG nova.network.os_vif_util [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:4e:f2,bridge_name='br-int',has_traffic_filtering=True,id=ea4ec714-57e6-4a38-9844-3dde4e2e085c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea4ec714-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.526 186180 DEBUG os_vif [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:4e:f2,bridge_name='br-int',has_traffic_filtering=True,id=ea4ec714-57e6-4a38-9844-3dde4e2e085c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea4ec714-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.527 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.528 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.529 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.534 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.534 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea4ec714-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.535 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea4ec714-57, col_values=(('external_ids', {'iface-id': 'ea4ec714-57e6-4a38-9844-3dde4e2e085c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:4e:f2', 'vm-uuid': '8fd003a2-24e8-4868-8c36-8795ad9aefd7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.537 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:39 compute-0 NetworkManager[56463]: <info>  [1771263879.5390] manager: (tapea4ec714-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.541 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.545 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.547 186180 INFO os_vif [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:4e:f2,bridge_name='br-int',has_traffic_filtering=True,id=ea4ec714-57e6-4a38-9844-3dde4e2e085c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea4ec714-57')
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.608 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.608 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.608 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No VIF found with MAC fa:16:3e:a4:4e:f2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:44:39 compute-0 nova_compute[186176]: 2026-02-16 17:44:39.609 186180 INFO nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Using config drive
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.211 186180 INFO nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Creating config drive at /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk.config
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.218 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpy0_w77lc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.342 186180 DEBUG oslo_concurrency.processutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpy0_w77lc" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:44:42 compute-0 kernel: tapea4ec714-57: entered promiscuous mode
Feb 16 17:44:42 compute-0 NetworkManager[56463]: <info>  [1771263882.4041] manager: (tapea4ec714-57): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Feb 16 17:44:42 compute-0 ovn_controller[96437]: 2026-02-16T17:44:42Z|00172|binding|INFO|Claiming lport ea4ec714-57e6-4a38-9844-3dde4e2e085c for this chassis.
Feb 16 17:44:42 compute-0 ovn_controller[96437]: 2026-02-16T17:44:42Z|00173|binding|INFO|ea4ec714-57e6-4a38-9844-3dde4e2e085c: Claiming fa:16:3e:a4:4e:f2 10.100.0.10
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.405 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:42 compute-0 ovn_controller[96437]: 2026-02-16T17:44:42Z|00174|binding|INFO|Setting lport ea4ec714-57e6-4a38-9844-3dde4e2e085c ovn-installed in OVS
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.410 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.413 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.415 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:42 compute-0 systemd-udevd[213858]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:44:42 compute-0 NetworkManager[56463]: <info>  [1771263882.4557] device (tapea4ec714-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:44:42 compute-0 NetworkManager[56463]: <info>  [1771263882.4569] device (tapea4ec714-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:44:42 compute-0 systemd-machined[155631]: New machine qemu-17-instance-00000016.
Feb 16 17:44:42 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000016.
Feb 16 17:44:42 compute-0 ovn_controller[96437]: 2026-02-16T17:44:42Z|00175|binding|INFO|Setting lport ea4ec714-57e6-4a38-9844-3dde4e2e085c up in Southbound
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.519 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:4e:f2 10.100.0.10'], port_security=['fa:16:3e:a4:4e:f2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8fd003a2-24e8-4868-8c36-8795ad9aefd7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=ea4ec714-57e6-4a38-9844-3dde4e2e085c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.522 105730 INFO neutron.agent.ovn.metadata.agent [-] Port ea4ec714-57e6-4a38-9844-3dde4e2e085c in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 bound to our chassis
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.523 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.541 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[eb879c78-4fe9-47c8-a8d3-c2fd27eaa116]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.542 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap94cafcd0-c1 in ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.544 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap94cafcd0-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.544 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[dc13796b-88ab-4d4a-81b6-11c085148095]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.545 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[9344ffa2-c5b4-4a21-a27f-8a02d65a12e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.562 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[6d3ecc12-a879-4878-994e-d4c110a608f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.578 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[57f9b125-50cd-4125-aa53-0a095fdae1a1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.611 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf4c06b-5805-4804-ba5d-afade9aaf1ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 systemd-udevd[213860]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:44:42 compute-0 NetworkManager[56463]: <info>  [1771263882.6226] manager: (tap94cafcd0-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.621 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[6fa2bbbe-b7b0-4ba3-8e20-44719add4adb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.659 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[e8708d1d-8667-461a-a6e8-459c4c8ca9d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.665 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[e2e588f6-37ff-4ac2-aee4-f301469ad5fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 NetworkManager[56463]: <info>  [1771263882.6903] device (tap94cafcd0-c0): carrier: link connected
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.696 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[88eefa10-ec89-4df7-ab45-17e3a7d0e2b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.718 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[6e0ef747-3914-48d6-a21f-2ad1b354bca8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547821, 'reachable_time': 30716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213894, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.736 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[1c7d701b-847c-401b-88a8-43b448d34ab8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547821, 'tstamp': 547821}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213895, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.756 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3f186c08-cfc0-45b5-aa15-2cfa05a53661]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547821, 'reachable_time': 30716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213896, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.792 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3e600f81-8daf-4fc0-9e85-7840ce04acc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.867 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[99e2a6c2-8778-436b-81e3-315faebad12c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.869 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.869 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.870 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94cafcd0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:44:42 compute-0 NetworkManager[56463]: <info>  [1771263882.8738] manager: (tap94cafcd0-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Feb 16 17:44:42 compute-0 kernel: tap94cafcd0-c0: entered promiscuous mode
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.873 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.877 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94cafcd0-c0, col_values=(('external_ids', {'iface-id': '5c28d585-b48c-40c6-b5e7-f1e59317b2de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.879 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:42 compute-0 ovn_controller[96437]: 2026-02-16T17:44:42Z|00176|binding|INFO|Releasing lport 5c28d585-b48c-40c6-b5e7-f1e59317b2de from this chassis (sb_readonly=0)
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.885 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.886 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.887 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.889 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[d2dab153-d18a-4217-902a-66248a6a3404]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.891 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:44:42 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:44:42.892 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'env', 'PROCESS_TAG=haproxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.923 186180 DEBUG nova.compute.manager [req-e2236c41-c0cf-477c-9f32-4db7c9e9dc2f req-a8454954-0524-45b3-af3f-5b8661609819 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.924 186180 DEBUG oslo_concurrency.lockutils [req-e2236c41-c0cf-477c-9f32-4db7c9e9dc2f req-a8454954-0524-45b3-af3f-5b8661609819 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.925 186180 DEBUG oslo_concurrency.lockutils [req-e2236c41-c0cf-477c-9f32-4db7c9e9dc2f req-a8454954-0524-45b3-af3f-5b8661609819 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.926 186180 DEBUG oslo_concurrency.lockutils [req-e2236c41-c0cf-477c-9f32-4db7c9e9dc2f req-a8454954-0524-45b3-af3f-5b8661609819 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:44:42 compute-0 nova_compute[186176]: 2026-02-16 17:44:42.926 186180 DEBUG nova.compute.manager [req-e2236c41-c0cf-477c-9f32-4db7c9e9dc2f req-a8454954-0524-45b3-af3f-5b8661609819 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Processing event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.159 186180 DEBUG nova.compute.manager [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.161 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263883.1588917, 8fd003a2-24e8-4868-8c36-8795ad9aefd7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.161 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] VM Started (Lifecycle Event)
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.166 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.170 186180 INFO nova.virt.libvirt.driver [-] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Instance spawned successfully.
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.171 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.189 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.196 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.202 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.203 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.204 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.205 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.205 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.206 186180 DEBUG nova.virt.libvirt.driver [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.218 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.219 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263883.1592603, 8fd003a2-24e8-4868-8c36-8795ad9aefd7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.219 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] VM Paused (Lifecycle Event)
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.246 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.250 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263883.165449, 8fd003a2-24e8-4868-8c36-8795ad9aefd7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.251 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] VM Resumed (Lifecycle Event)
Feb 16 17:44:43 compute-0 podman[213935]: 2026-02-16 17:44:43.274136764 +0000 UTC m=+0.059313265 container create 4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.283 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.287 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.308 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.320 186180 INFO nova.compute.manager [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Took 9.99 seconds to spawn the instance on the hypervisor.
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.321 186180 DEBUG nova.compute.manager [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:44:43 compute-0 systemd[1]: Started libpod-conmon-4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0.scope.
Feb 16 17:44:43 compute-0 podman[213935]: 2026-02-16 17:44:43.239382792 +0000 UTC m=+0.024559343 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:44:43 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:44:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c44f31b2f9aa2b73f9030a094d9f00d22c5f0908de70d8a2c57af1b46c39decc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:44:43 compute-0 podman[213935]: 2026-02-16 17:44:43.376247358 +0000 UTC m=+0.161423909 container init 4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 17:44:43 compute-0 podman[213935]: 2026-02-16 17:44:43.383978297 +0000 UTC m=+0.169154798 container start 4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.417 186180 INFO nova.compute.manager [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Took 11.06 seconds to build instance.
Feb 16 17:44:43 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213951]: [NOTICE]   (213955) : New worker (213957) forked
Feb 16 17:44:43 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213951]: [NOTICE]   (213955) : Loading success.
Feb 16 17:44:43 compute-0 nova_compute[186176]: 2026-02-16 17:44:43.449 186180 DEBUG oslo_concurrency.lockutils [None req-206a296d-a620-4479-a28c-cadd045cd0d1 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:44:44 compute-0 nova_compute[186176]: 2026-02-16 17:44:44.515 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:44 compute-0 nova_compute[186176]: 2026-02-16 17:44:44.537 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:45 compute-0 nova_compute[186176]: 2026-02-16 17:44:45.025 186180 DEBUG nova.network.neutron [req-20675996-f8e6-49a1-8b52-41a43b7c8642 req-6b990ba5-8e0a-4387-a82f-e3965fa4fe68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Updated VIF entry in instance network info cache for port ea4ec714-57e6-4a38-9844-3dde4e2e085c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:44:45 compute-0 nova_compute[186176]: 2026-02-16 17:44:45.026 186180 DEBUG nova.network.neutron [req-20675996-f8e6-49a1-8b52-41a43b7c8642 req-6b990ba5-8e0a-4387-a82f-e3965fa4fe68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Updating instance_info_cache with network_info: [{"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:44:45 compute-0 nova_compute[186176]: 2026-02-16 17:44:45.064 186180 DEBUG oslo_concurrency.lockutils [req-20675996-f8e6-49a1-8b52-41a43b7c8642 req-6b990ba5-8e0a-4387-a82f-e3965fa4fe68 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:44:46 compute-0 nova_compute[186176]: 2026-02-16 17:44:46.066 186180 DEBUG nova.compute.manager [req-900a1cf5-47fa-4c4a-a8ff-e8abb6c058d6 req-5fb703a4-68a7-40b0-9e17-974f9cddd25b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:44:46 compute-0 nova_compute[186176]: 2026-02-16 17:44:46.067 186180 DEBUG oslo_concurrency.lockutils [req-900a1cf5-47fa-4c4a-a8ff-e8abb6c058d6 req-5fb703a4-68a7-40b0-9e17-974f9cddd25b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:44:46 compute-0 nova_compute[186176]: 2026-02-16 17:44:46.067 186180 DEBUG oslo_concurrency.lockutils [req-900a1cf5-47fa-4c4a-a8ff-e8abb6c058d6 req-5fb703a4-68a7-40b0-9e17-974f9cddd25b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:44:46 compute-0 nova_compute[186176]: 2026-02-16 17:44:46.068 186180 DEBUG oslo_concurrency.lockutils [req-900a1cf5-47fa-4c4a-a8ff-e8abb6c058d6 req-5fb703a4-68a7-40b0-9e17-974f9cddd25b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:44:46 compute-0 nova_compute[186176]: 2026-02-16 17:44:46.068 186180 DEBUG nova.compute.manager [req-900a1cf5-47fa-4c4a-a8ff-e8abb6c058d6 req-5fb703a4-68a7-40b0-9e17-974f9cddd25b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] No waiting events found dispatching network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:44:46 compute-0 nova_compute[186176]: 2026-02-16 17:44:46.069 186180 WARNING nova.compute.manager [req-900a1cf5-47fa-4c4a-a8ff-e8abb6c058d6 req-5fb703a4-68a7-40b0-9e17-974f9cddd25b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received unexpected event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c for instance with vm_state active and task_state None.
Feb 16 17:44:49 compute-0 nova_compute[186176]: 2026-02-16 17:44:49.319 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:44:49 compute-0 nova_compute[186176]: 2026-02-16 17:44:49.320 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:44:49 compute-0 nova_compute[186176]: 2026-02-16 17:44:49.320 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:44:49 compute-0 nova_compute[186176]: 2026-02-16 17:44:49.518 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:49 compute-0 nova_compute[186176]: 2026-02-16 17:44:49.539 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:49 compute-0 nova_compute[186176]: 2026-02-16 17:44:49.890 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:44:49 compute-0 nova_compute[186176]: 2026-02-16 17:44:49.890 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:44:49 compute-0 nova_compute[186176]: 2026-02-16 17:44:49.890 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:44:49 compute-0 nova_compute[186176]: 2026-02-16 17:44:49.891 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8fd003a2-24e8-4868-8c36-8795ad9aefd7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:44:52 compute-0 nova_compute[186176]: 2026-02-16 17:44:52.913 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Updating instance_info_cache with network_info: [{"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.642 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.643 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.643 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.644 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.676 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.677 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.677 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.677 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.786 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.881 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.883 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:44:53 compute-0 nova_compute[186176]: 2026-02-16 17:44:53.961 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.159 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.161 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5650MB free_disk=73.2219123840332GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.161 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.161 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.356 186180 INFO nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance e32add26-3aff-46b2-a115-3abe3d5fc6e1 has allocations against this compute host but is not found in the database.
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.357 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.357 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.380 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing inventories for resource provider bb904aac-529f-46ef-9861-9c655a4b383c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.423 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating ProviderTree inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.424 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.443 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing aggregate associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.471 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing trait associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.519 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.526 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.541 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.547 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.573 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.573 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.891 186180 DEBUG nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Check if temp file /var/lib/nova/instances/tmpds35we4c exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 17:44:54 compute-0 nova_compute[186176]: 2026-02-16 17:44:54.892 186180 DEBUG nova.compute.manager [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpds35we4c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8fd003a2-24e8-4868-8c36-8795ad9aefd7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 17:44:55 compute-0 nova_compute[186176]: 2026-02-16 17:44:55.247 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:44:55 compute-0 nova_compute[186176]: 2026-02-16 17:44:55.247 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:44:55 compute-0 nova_compute[186176]: 2026-02-16 17:44:55.268 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:44:55 compute-0 nova_compute[186176]: 2026-02-16 17:44:55.268 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:44:55 compute-0 nova_compute[186176]: 2026-02-16 17:44:55.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:44:55 compute-0 nova_compute[186176]: 2026-02-16 17:44:55.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:44:55 compute-0 nova_compute[186176]: 2026-02-16 17:44:55.875 186180 DEBUG oslo_concurrency.processutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:44:55 compute-0 nova_compute[186176]: 2026-02-16 17:44:55.939 186180 DEBUG oslo_concurrency.processutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:44:55 compute-0 nova_compute[186176]: 2026-02-16 17:44:55.941 186180 DEBUG oslo_concurrency.processutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:44:55 compute-0 ovn_controller[96437]: 2026-02-16T17:44:55Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a4:4e:f2 10.100.0.10
Feb 16 17:44:55 compute-0 ovn_controller[96437]: 2026-02-16T17:44:55Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a4:4e:f2 10.100.0.10
Feb 16 17:44:56 compute-0 nova_compute[186176]: 2026-02-16 17:44:56.008 186180 DEBUG oslo_concurrency.processutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:44:57 compute-0 nova_compute[186176]: 2026-02-16 17:44:57.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:44:59 compute-0 sshd-session[213991]: Accepted publickey for nova from 192.168.122.101 port 35926 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:44:59 compute-0 nova_compute[186176]: 2026-02-16 17:44:59.521 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:59 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 17:44:59 compute-0 nova_compute[186176]: 2026-02-16 17:44:59.543 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:44:59 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 17:44:59 compute-0 systemd-logind[821]: New session 39 of user nova.
Feb 16 17:44:59 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 17:44:59 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 17:44:59 compute-0 systemd[214009]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:44:59 compute-0 podman[213993]: 2026-02-16 17:44:59.620877031 +0000 UTC m=+0.090374546 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, distribution-scope=public, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, version=9.7, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Feb 16 17:44:59 compute-0 systemd[214009]: Queued start job for default target Main User Target.
Feb 16 17:44:59 compute-0 systemd[214009]: Created slice User Application Slice.
Feb 16 17:44:59 compute-0 systemd[214009]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:44:59 compute-0 systemd[214009]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 17:44:59 compute-0 systemd[214009]: Reached target Paths.
Feb 16 17:44:59 compute-0 systemd[214009]: Reached target Timers.
Feb 16 17:44:59 compute-0 systemd[214009]: Starting D-Bus User Message Bus Socket...
Feb 16 17:44:59 compute-0 systemd[214009]: Starting Create User's Volatile Files and Directories...
Feb 16 17:44:59 compute-0 systemd[214009]: Finished Create User's Volatile Files and Directories.
Feb 16 17:44:59 compute-0 systemd[214009]: Listening on D-Bus User Message Bus Socket.
Feb 16 17:44:59 compute-0 systemd[214009]: Reached target Sockets.
Feb 16 17:44:59 compute-0 systemd[214009]: Reached target Basic System.
Feb 16 17:44:59 compute-0 systemd[214009]: Reached target Main User Target.
Feb 16 17:44:59 compute-0 systemd[214009]: Startup finished in 117ms.
Feb 16 17:44:59 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 17:44:59 compute-0 podman[195505]: time="2026-02-16T17:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:44:59 compute-0 systemd[1]: Started Session 39 of User nova.
Feb 16 17:44:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:44:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 16 17:44:59 compute-0 sshd-session[213991]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:44:59 compute-0 sshd-session[214032]: Received disconnect from 192.168.122.101 port 35926:11: disconnected by user
Feb 16 17:44:59 compute-0 sshd-session[214032]: Disconnected from user nova 192.168.122.101 port 35926
Feb 16 17:44:59 compute-0 sshd-session[213991]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:44:59 compute-0 systemd[1]: session-39.scope: Deactivated successfully.
Feb 16 17:44:59 compute-0 systemd-logind[821]: Session 39 logged out. Waiting for processes to exit.
Feb 16 17:44:59 compute-0 systemd-logind[821]: Removed session 39.
Feb 16 17:45:01 compute-0 openstack_network_exporter[198360]: ERROR   17:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:45:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:45:01 compute-0 openstack_network_exporter[198360]: ERROR   17:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:45:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:45:02 compute-0 podman[214034]: 2026-02-16 17:45:02.105782246 +0000 UTC m=+0.073000571 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 17:45:02 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:02.345 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:45:02 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:02.347 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:45:02 compute-0 nova_compute[186176]: 2026-02-16 17:45:02.379 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:02 compute-0 nova_compute[186176]: 2026-02-16 17:45:02.409 186180 DEBUG nova.compute.manager [req-b2a01424-293b-42f7-969a-f650ab0eab33 req-61fd100c-5837-48d8-9c6a-9de2e33c7c0e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-unplugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:45:02 compute-0 nova_compute[186176]: 2026-02-16 17:45:02.409 186180 DEBUG oslo_concurrency.lockutils [req-b2a01424-293b-42f7-969a-f650ab0eab33 req-61fd100c-5837-48d8-9c6a-9de2e33c7c0e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:02 compute-0 nova_compute[186176]: 2026-02-16 17:45:02.410 186180 DEBUG oslo_concurrency.lockutils [req-b2a01424-293b-42f7-969a-f650ab0eab33 req-61fd100c-5837-48d8-9c6a-9de2e33c7c0e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:02 compute-0 nova_compute[186176]: 2026-02-16 17:45:02.410 186180 DEBUG oslo_concurrency.lockutils [req-b2a01424-293b-42f7-969a-f650ab0eab33 req-61fd100c-5837-48d8-9c6a-9de2e33c7c0e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:02 compute-0 nova_compute[186176]: 2026-02-16 17:45:02.410 186180 DEBUG nova.compute.manager [req-b2a01424-293b-42f7-969a-f650ab0eab33 req-61fd100c-5837-48d8-9c6a-9de2e33c7c0e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] No waiting events found dispatching network-vif-unplugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:45:02 compute-0 nova_compute[186176]: 2026-02-16 17:45:02.411 186180 DEBUG nova.compute.manager [req-b2a01424-293b-42f7-969a-f650ab0eab33 req-61fd100c-5837-48d8-9c6a-9de2e33c7c0e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-unplugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.357 186180 INFO nova.compute.manager [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Took 7.35 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.358 186180 DEBUG nova.compute.manager [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.382 186180 DEBUG nova.compute.manager [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpds35we4c',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8fd003a2-24e8-4868-8c36-8795ad9aefd7',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(e32add26-3aff-46b2-a115-3abe3d5fc6e1),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.421 186180 DEBUG nova.objects.instance [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lazy-loading 'migration_context' on Instance uuid 8fd003a2-24e8-4868-8c36-8795ad9aefd7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.423 186180 DEBUG nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.426 186180 DEBUG nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.426 186180 DEBUG nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.457 186180 DEBUG nova.virt.libvirt.vif [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1847915730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1847915730',id=22,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:44:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-yb2z0gbq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:44:43Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=8fd003a2-24e8-4868-8c36-8795ad9aefd7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.458 186180 DEBUG nova.network.os_vif_util [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Converting VIF {"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.459 186180 DEBUG nova.network.os_vif_util [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:4e:f2,bridge_name='br-int',has_traffic_filtering=True,id=ea4ec714-57e6-4a38-9844-3dde4e2e085c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea4ec714-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.460 186180 DEBUG nova.virt.libvirt.migration [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 17:45:03 compute-0 nova_compute[186176]:   <mac address="fa:16:3e:a4:4e:f2"/>
Feb 16 17:45:03 compute-0 nova_compute[186176]:   <model type="virtio"/>
Feb 16 17:45:03 compute-0 nova_compute[186176]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:45:03 compute-0 nova_compute[186176]:   <mtu size="1442"/>
Feb 16 17:45:03 compute-0 nova_compute[186176]:   <target dev="tapea4ec714-57"/>
Feb 16 17:45:03 compute-0 nova_compute[186176]: </interface>
Feb 16 17:45:03 compute-0 nova_compute[186176]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.461 186180 DEBUG nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.929 186180 DEBUG nova.virt.libvirt.migration [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:45:03 compute-0 nova_compute[186176]: 2026-02-16 17:45:03.929 186180 INFO nova.virt.libvirt.migration [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.018 186180 INFO nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.522 186180 DEBUG nova.compute.manager [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.523 186180 DEBUG oslo_concurrency.lockutils [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.523 186180 DEBUG oslo_concurrency.lockutils [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.523 186180 DEBUG oslo_concurrency.lockutils [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.523 186180 DEBUG nova.compute.manager [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] No waiting events found dispatching network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.524 186180 WARNING nova.compute.manager [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received unexpected event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c for instance with vm_state active and task_state migrating.
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.524 186180 DEBUG nova.compute.manager [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-changed-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.524 186180 DEBUG nova.compute.manager [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Refreshing instance network info cache due to event network-changed-ea4ec714-57e6-4a38-9844-3dde4e2e085c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.524 186180 DEBUG oslo_concurrency.lockutils [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.524 186180 DEBUG oslo_concurrency.lockutils [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.524 186180 DEBUG nova.network.neutron [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Refreshing network info cache for port ea4ec714-57e6-4a38-9844-3dde4e2e085c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.561 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.567 186180 DEBUG nova.virt.libvirt.migration [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:45:04 compute-0 nova_compute[186176]: 2026-02-16 17:45:04.567 186180 DEBUG nova.virt.libvirt.migration [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.004 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263905.0036805, 8fd003a2-24e8-4868-8c36-8795ad9aefd7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.005 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] VM Paused (Lifecycle Event)
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.055 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.060 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.071 186180 DEBUG nova.virt.libvirt.migration [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.071 186180 DEBUG nova.virt.libvirt.migration [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.097 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 17:45:05 compute-0 kernel: tapea4ec714-57 (unregistering): left promiscuous mode
Feb 16 17:45:05 compute-0 NetworkManager[56463]: <info>  [1771263905.1636] device (tapea4ec714-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.172 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:05 compute-0 ovn_controller[96437]: 2026-02-16T17:45:05Z|00177|binding|INFO|Releasing lport ea4ec714-57e6-4a38-9844-3dde4e2e085c from this chassis (sb_readonly=0)
Feb 16 17:45:05 compute-0 ovn_controller[96437]: 2026-02-16T17:45:05Z|00178|binding|INFO|Setting lport ea4ec714-57e6-4a38-9844-3dde4e2e085c down in Southbound
Feb 16 17:45:05 compute-0 ovn_controller[96437]: 2026-02-16T17:45:05Z|00179|binding|INFO|Removing iface tapea4ec714-57 ovn-installed in OVS
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.176 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.186 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.192 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:4e:f2 10.100.0.10'], port_security=['fa:16:3e:a4:4e:f2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '2e3a84a9-c1b4-4b1e-92e3-57d0875592cc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8fd003a2-24e8-4868-8c36-8795ad9aefd7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=ea4ec714-57e6-4a38-9844-3dde4e2e085c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.194 105730 INFO neutron.agent.ovn.metadata.agent [-] Port ea4ec714-57e6-4a38-9844-3dde4e2e085c in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 unbound from our chassis
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.197 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.199 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e771bdac-329d-4115-b534-716bbc626e4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.201 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace which is not needed anymore
Feb 16 17:45:05 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000016.scope: Deactivated successfully.
Feb 16 17:45:05 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000016.scope: Consumed 12.987s CPU time.
Feb 16 17:45:05 compute-0 systemd-machined[155631]: Machine qemu-17-instance-00000016 terminated.
Feb 16 17:45:05 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213951]: [NOTICE]   (213955) : haproxy version is 2.8.14-c23fe91
Feb 16 17:45:05 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213951]: [NOTICE]   (213955) : path to executable is /usr/sbin/haproxy
Feb 16 17:45:05 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213951]: [WARNING]  (213955) : Exiting Master process...
Feb 16 17:45:05 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213951]: [ALERT]    (213955) : Current worker (213957) exited with code 143 (Terminated)
Feb 16 17:45:05 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[213951]: [WARNING]  (213955) : All workers exited. Exiting... (0)
Feb 16 17:45:05 compute-0 systemd[1]: libpod-4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0.scope: Deactivated successfully.
Feb 16 17:45:05 compute-0 podman[214083]: 2026-02-16 17:45:05.33833152 +0000 UTC m=+0.044413340 container died 4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:45:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0-userdata-shm.mount: Deactivated successfully.
Feb 16 17:45:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-c44f31b2f9aa2b73f9030a094d9f00d22c5f0908de70d8a2c57af1b46c39decc-merged.mount: Deactivated successfully.
Feb 16 17:45:05 compute-0 podman[214083]: 2026-02-16 17:45:05.38522552 +0000 UTC m=+0.091307360 container cleanup 4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 16 17:45:05 compute-0 systemd[1]: libpod-conmon-4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0.scope: Deactivated successfully.
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.396 186180 DEBUG nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.396 186180 DEBUG nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.396 186180 DEBUG nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 17:45:05 compute-0 podman[214130]: 2026-02-16 17:45:05.453702818 +0000 UTC m=+0.045435325 container remove 4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.458 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[8b33fbaf-2da7-4ff4-a8a0-79f8ee684ce9]: (4, ('Mon Feb 16 05:45:05 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0)\n4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0\nMon Feb 16 05:45:05 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0)\n4c75256b48d4aeeb6a0ba1e9effc623606ae9abb63ce8403bafabedfdce68ca0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.460 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[a2cb561e-3176-4988-9da9-14e30386061e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.461 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:45:05 compute-0 kernel: tap94cafcd0-c0: left promiscuous mode
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.464 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.473 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.478 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[7c342937-4d82-4d30-9cea-69398656c6c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.492 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[4b0ca10a-f0c9-46f3-872f-a6d2f82fec17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.495 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[cb96194c-bdc6-482d-9d67-cd313b935787]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.512 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[ae643e3e-e2a2-4df3-847c-9fac4345e688]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547813, 'reachable_time': 25113, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214149, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:45:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d94cafcd0\x2dc7c2\x2d48b4\x2da2dd\x2d21c16ce48dc4.mount: Deactivated successfully.
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.519 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:45:05 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:05.519 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[5585c440-dd95-4a90-ae1b-97163d97aa7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.574 186180 DEBUG nova.virt.libvirt.guest [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '8fd003a2-24e8-4868-8c36-8795ad9aefd7' (instance-00000016) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.575 186180 INFO nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Migration operation has completed
Feb 16 17:45:05 compute-0 nova_compute[186176]: 2026-02-16 17:45:05.575 186180 INFO nova.compute.manager [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] _post_live_migration() is started..
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.417 186180 DEBUG nova.network.neutron [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Updated VIF entry in instance network info cache for port ea4ec714-57e6-4a38-9844-3dde4e2e085c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.418 186180 DEBUG nova.network.neutron [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Updating instance_info_cache with network_info: [{"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.443 186180 DEBUG oslo_concurrency.lockutils [req-c1757490-2d5c-4a6e-a3d7-4a02539ea1a9 req-b5713a5a-2793-44b0-a311-15cb0ef49147 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-8fd003a2-24e8-4868-8c36-8795ad9aefd7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.679 186180 DEBUG nova.compute.manager [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-unplugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.679 186180 DEBUG oslo_concurrency.lockutils [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.680 186180 DEBUG oslo_concurrency.lockutils [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.680 186180 DEBUG oslo_concurrency.lockutils [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.680 186180 DEBUG nova.compute.manager [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] No waiting events found dispatching network-vif-unplugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.681 186180 DEBUG nova.compute.manager [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-unplugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.681 186180 DEBUG nova.compute.manager [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.681 186180 DEBUG oslo_concurrency.lockutils [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.681 186180 DEBUG oslo_concurrency.lockutils [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.682 186180 DEBUG oslo_concurrency.lockutils [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.682 186180 DEBUG nova.compute.manager [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] No waiting events found dispatching network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:45:06 compute-0 nova_compute[186176]: 2026-02-16 17:45:06.682 186180 WARNING nova.compute.manager [req-db07ee7c-a194-42d1-8ada-258f4814bff6 req-7d2d03be-6aa2-4b1d-aad9-c93dffd086d9 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received unexpected event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c for instance with vm_state active and task_state migrating.
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.089 186180 DEBUG nova.network.neutron [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Activated binding for port ea4ec714-57e6-4a38-9844-3dde4e2e085c and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.090 186180 DEBUG nova.compute.manager [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.091 186180 DEBUG nova.virt.libvirt.vif [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:44:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1847915730',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1847915730',id=22,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:44:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-yb2z0gbq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:44:49Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=8fd003a2-24e8-4868-8c36-8795ad9aefd7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.092 186180 DEBUG nova.network.os_vif_util [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Converting VIF {"id": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "address": "fa:16:3e:a4:4e:f2", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea4ec714-57", "ovs_interfaceid": "ea4ec714-57e6-4a38-9844-3dde4e2e085c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.093 186180 DEBUG nova.network.os_vif_util [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:4e:f2,bridge_name='br-int',has_traffic_filtering=True,id=ea4ec714-57e6-4a38-9844-3dde4e2e085c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea4ec714-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.094 186180 DEBUG os_vif [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:4e:f2,bridge_name='br-int',has_traffic_filtering=True,id=ea4ec714-57e6-4a38-9844-3dde4e2e085c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea4ec714-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.096 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.097 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea4ec714-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.099 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.102 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.108 186180 INFO os_vif [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:4e:f2,bridge_name='br-int',has_traffic_filtering=True,id=ea4ec714-57e6-4a38-9844-3dde4e2e085c,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea4ec714-57')
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.109 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.110 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.110 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.111 186180 DEBUG nova.compute.manager [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.112 186180 INFO nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Deleting instance files /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7_del
Feb 16 17:45:07 compute-0 nova_compute[186176]: 2026-02-16 17:45:07.113 186180 INFO nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Deletion of /var/lib/nova/instances/8fd003a2-24e8-4868-8c36-8795ad9aefd7_del complete
Feb 16 17:45:08 compute-0 podman[214151]: 2026-02-16 17:45:08.121077728 +0000 UTC m=+0.070109740 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:45:08 compute-0 podman[214150]: 2026-02-16 17:45:08.196776404 +0000 UTC m=+0.151104806 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.795 186180 DEBUG nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.797 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.797 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.798 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.798 186180 DEBUG nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] No waiting events found dispatching network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.798 186180 WARNING nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received unexpected event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c for instance with vm_state active and task_state migrating.
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.799 186180 DEBUG nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-unplugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.799 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.800 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.800 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.801 186180 DEBUG nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] No waiting events found dispatching network-vif-unplugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.801 186180 DEBUG nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-unplugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.802 186180 DEBUG nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.802 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.803 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.803 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.804 186180 DEBUG nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] No waiting events found dispatching network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.804 186180 WARNING nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received unexpected event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c for instance with vm_state active and task_state migrating.
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.804 186180 DEBUG nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.805 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.805 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.806 186180 DEBUG oslo_concurrency.lockutils [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.806 186180 DEBUG nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] No waiting events found dispatching network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:45:08 compute-0 nova_compute[186176]: 2026-02-16 17:45:08.807 186180 WARNING nova.compute.manager [req-1a8434fe-16d9-4540-93cb-b0dfd20e8f62 req-ba90c32b-4917-4070-8241-240d4489fe17 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Received unexpected event network-vif-plugged-ea4ec714-57e6-4a38-9844-3dde4e2e085c for instance with vm_state active and task_state migrating.
Feb 16 17:45:09 compute-0 nova_compute[186176]: 2026-02-16 17:45:09.563 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:10 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 17:45:10 compute-0 systemd[214009]: Activating special unit Exit the Session...
Feb 16 17:45:10 compute-0 systemd[214009]: Stopped target Main User Target.
Feb 16 17:45:10 compute-0 systemd[214009]: Stopped target Basic System.
Feb 16 17:45:10 compute-0 systemd[214009]: Stopped target Paths.
Feb 16 17:45:10 compute-0 systemd[214009]: Stopped target Sockets.
Feb 16 17:45:10 compute-0 systemd[214009]: Stopped target Timers.
Feb 16 17:45:10 compute-0 systemd[214009]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:45:10 compute-0 systemd[214009]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 17:45:10 compute-0 systemd[214009]: Closed D-Bus User Message Bus Socket.
Feb 16 17:45:10 compute-0 systemd[214009]: Stopped Create User's Volatile Files and Directories.
Feb 16 17:45:10 compute-0 systemd[214009]: Removed slice User Application Slice.
Feb 16 17:45:10 compute-0 systemd[214009]: Reached target Shutdown.
Feb 16 17:45:10 compute-0 systemd[214009]: Finished Exit the Session.
Feb 16 17:45:10 compute-0 systemd[214009]: Reached target Exit the Session.
Feb 16 17:45:10 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 17:45:10 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 17:45:10 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 17:45:10 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 17:45:10 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 17:45:10 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 17:45:10 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 17:45:12 compute-0 nova_compute[186176]: 2026-02-16 17:45:12.100 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:12 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:12.349 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:45:12 compute-0 ovn_controller[96437]: 2026-02-16T17:45:12Z|00180|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.214 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Acquiring lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.215 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.216 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "8fd003a2-24e8-4868-8c36-8795ad9aefd7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.264 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.265 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.266 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.266 186180 DEBUG nova.compute.resource_tracker [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.480 186180 WARNING nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.483 186180 DEBUG nova.compute.resource_tracker [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5790MB free_disk=73.22379684448242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.484 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.484 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.533 186180 DEBUG nova.compute.resource_tracker [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Migration for instance 8fd003a2-24e8-4868-8c36-8795ad9aefd7 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.558 186180 DEBUG nova.compute.resource_tracker [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.596 186180 DEBUG nova.compute.resource_tracker [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Migration e32add26-3aff-46b2-a115-3abe3d5fc6e1 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.597 186180 DEBUG nova.compute.resource_tracker [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.597 186180 DEBUG nova.compute.resource_tracker [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.682 186180 DEBUG nova.compute.provider_tree [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.704 186180 DEBUG nova.scheduler.client.report [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.733 186180 DEBUG nova.compute.resource_tracker [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.734 186180 DEBUG oslo_concurrency.lockutils [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.739 186180 INFO nova.compute.manager [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.939 186180 INFO nova.scheduler.client.report [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Deleted allocation for migration e32add26-3aff-46b2-a115-3abe3d5fc6e1
Feb 16 17:45:13 compute-0 nova_compute[186176]: 2026-02-16 17:45:13.939 186180 DEBUG nova.virt.libvirt.driver [None req-ea385826-60a8-4b75-965d-784a63f8e49f 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 17:45:14 compute-0 nova_compute[186176]: 2026-02-16 17:45:14.588 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:17 compute-0 nova_compute[186176]: 2026-02-16 17:45:17.103 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:19 compute-0 nova_compute[186176]: 2026-02-16 17:45:19.591 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:20 compute-0 nova_compute[186176]: 2026-02-16 17:45:20.395 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771263905.3931022, 8fd003a2-24e8-4868-8c36-8795ad9aefd7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:45:20 compute-0 nova_compute[186176]: 2026-02-16 17:45:20.395 186180 INFO nova.compute.manager [-] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] VM Stopped (Lifecycle Event)
Feb 16 17:45:20 compute-0 nova_compute[186176]: 2026-02-16 17:45:20.418 186180 DEBUG nova.compute.manager [None req-be8b59f2-545e-475e-8b45-02acd965f841 - - - - - -] [instance: 8fd003a2-24e8-4868-8c36-8795ad9aefd7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:45:22 compute-0 nova_compute[186176]: 2026-02-16 17:45:22.105 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:24 compute-0 nova_compute[186176]: 2026-02-16 17:45:24.628 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:27 compute-0 nova_compute[186176]: 2026-02-16 17:45:27.109 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:29 compute-0 nova_compute[186176]: 2026-02-16 17:45:29.629 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:29 compute-0 podman[195505]: time="2026-02-16T17:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:45:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:45:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 16 17:45:30 compute-0 podman[214203]: 2026-02-16 17:45:30.095872267 +0000 UTC m=+0.061036678 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1770267347, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, 
io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, architecture=x86_64)
Feb 16 17:45:31 compute-0 openstack_network_exporter[198360]: ERROR   17:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:45:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:45:31 compute-0 openstack_network_exporter[198360]: ERROR   17:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:45:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:45:32 compute-0 nova_compute[186176]: 2026-02-16 17:45:32.111 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:33 compute-0 podman[214224]: 2026-02-16 17:45:33.109086152 +0000 UTC m=+0.081817857 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 16 17:45:34 compute-0 nova_compute[186176]: 2026-02-16 17:45:34.632 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:37 compute-0 nova_compute[186176]: 2026-02-16 17:45:37.157 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:37 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 17:45:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:38.178 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:38.178 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:45:38.178 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:39 compute-0 podman[214245]: 2026-02-16 17:45:39.104989878 +0000 UTC m=+0.070421087 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:45:39 compute-0 podman[214244]: 2026-02-16 17:45:39.126924446 +0000 UTC m=+0.095327008 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.license=GPLv2, tcib_managed=true)
Feb 16 17:45:39 compute-0 nova_compute[186176]: 2026-02-16 17:45:39.634 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:42 compute-0 nova_compute[186176]: 2026-02-16 17:45:42.160 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:44 compute-0 nova_compute[186176]: 2026-02-16 17:45:44.634 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:47 compute-0 nova_compute[186176]: 2026-02-16 17:45:47.162 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:49 compute-0 nova_compute[186176]: 2026-02-16 17:45:49.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:45:49 compute-0 nova_compute[186176]: 2026-02-16 17:45:49.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:45:49 compute-0 nova_compute[186176]: 2026-02-16 17:45:49.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:45:49 compute-0 nova_compute[186176]: 2026-02-16 17:45:49.336 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:45:49 compute-0 nova_compute[186176]: 2026-02-16 17:45:49.636 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:50 compute-0 nova_compute[186176]: 2026-02-16 17:45:50.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.352 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.352 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.352 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.352 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.515 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.516 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5824MB free_disk=73.2237777709961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.516 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.516 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.582 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.583 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.608 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.623 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.625 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:45:51 compute-0 nova_compute[186176]: 2026-02-16 17:45:51.625 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:45:52 compute-0 nova_compute[186176]: 2026-02-16 17:45:52.164 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:52 compute-0 ovn_controller[96437]: 2026-02-16T17:45:52Z|00181|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Feb 16 17:45:54 compute-0 nova_compute[186176]: 2026-02-16 17:45:54.620 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:45:54 compute-0 nova_compute[186176]: 2026-02-16 17:45:54.621 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:45:54 compute-0 nova_compute[186176]: 2026-02-16 17:45:54.621 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:45:54 compute-0 nova_compute[186176]: 2026-02-16 17:45:54.637 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:55 compute-0 nova_compute[186176]: 2026-02-16 17:45:55.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:45:55 compute-0 nova_compute[186176]: 2026-02-16 17:45:55.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:45:57 compute-0 nova_compute[186176]: 2026-02-16 17:45:57.166 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:59 compute-0 nova_compute[186176]: 2026-02-16 17:45:59.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:45:59 compute-0 nova_compute[186176]: 2026-02-16 17:45:59.639 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:45:59 compute-0 podman[195505]: time="2026-02-16T17:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:45:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:45:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 17:46:01 compute-0 podman[214296]: 2026-02-16 17:46:01.10396029 +0000 UTC m=+0.065360663 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, 
config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 17:46:01 compute-0 openstack_network_exporter[198360]: ERROR   17:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:46:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:46:01 compute-0 openstack_network_exporter[198360]: ERROR   17:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:46:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:46:02 compute-0 nova_compute[186176]: 2026-02-16 17:46:02.168 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:04 compute-0 podman[214317]: 2026-02-16 17:46:04.068659829 +0000 UTC m=+0.043836306 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 17:46:04 compute-0 nova_compute[186176]: 2026-02-16 17:46:04.642 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:05 compute-0 nova_compute[186176]: 2026-02-16 17:46:05.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:07 compute-0 nova_compute[186176]: 2026-02-16 17:46:07.169 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:09 compute-0 nova_compute[186176]: 2026-02-16 17:46:09.645 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:10 compute-0 podman[214336]: 2026-02-16 17:46:10.10110605 +0000 UTC m=+0.055702146 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:46:10 compute-0 podman[214335]: 2026-02-16 17:46:10.143068959 +0000 UTC m=+0.095434341 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:46:12 compute-0 nova_compute[186176]: 2026-02-16 17:46:12.171 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:14 compute-0 nova_compute[186176]: 2026-02-16 17:46:14.693 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:14 compute-0 nova_compute[186176]: 2026-02-16 17:46:14.786 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:14 compute-0 nova_compute[186176]: 2026-02-16 17:46:14.786 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:14 compute-0 nova_compute[186176]: 2026-02-16 17:46:14.817 186180 DEBUG nova.compute.manager [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:46:14 compute-0 nova_compute[186176]: 2026-02-16 17:46:14.889 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:14 compute-0 nova_compute[186176]: 2026-02-16 17:46:14.889 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:14 compute-0 nova_compute[186176]: 2026-02-16 17:46:14.897 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:46:14 compute-0 nova_compute[186176]: 2026-02-16 17:46:14.897 186180 INFO nova.compute.claims [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:46:14 compute-0 nova_compute[186176]: 2026-02-16 17:46:14.983 186180 DEBUG nova.compute.provider_tree [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:46:14 compute-0 nova_compute[186176]: 2026-02-16 17:46:14.996 186180 DEBUG nova.scheduler.client.report [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.024 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.025 186180 DEBUG nova.compute.manager [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.076 186180 DEBUG nova.compute.manager [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.076 186180 DEBUG nova.network.neutron [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.097 186180 INFO nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.136 186180 DEBUG nova.compute.manager [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.241 186180 DEBUG nova.compute.manager [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.242 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.242 186180 INFO nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Creating image(s)
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.243 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "/var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.243 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.243 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "/var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.254 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.331 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.332 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.333 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.348 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.410 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.411 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.444 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.445 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.446 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.501 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.502 186180 DEBUG nova.virt.disk.api [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Checking if we can resize image /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.502 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.550 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.551 186180 DEBUG nova.virt.disk.api [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Cannot resize image /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.551 186180 DEBUG nova.objects.instance [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'migration_context' on Instance uuid 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.568 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.568 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Ensure instance console log exists: /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.569 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.569 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:15 compute-0 nova_compute[186176]: 2026-02-16 17:46:15.570 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:16 compute-0 nova_compute[186176]: 2026-02-16 17:46:16.069 186180 DEBUG nova.policy [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c54934f49b2044289bcf127662fe114b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:46:17 compute-0 nova_compute[186176]: 2026-02-16 17:46:17.220 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:18 compute-0 nova_compute[186176]: 2026-02-16 17:46:18.121 186180 DEBUG nova.network.neutron [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Successfully created port: 694c06a2-2c68-418e-91aa-8541c7793afa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:46:19 compute-0 nova_compute[186176]: 2026-02-16 17:46:19.696 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:19 compute-0 nova_compute[186176]: 2026-02-16 17:46:19.724 186180 DEBUG nova.network.neutron [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Successfully updated port: 694c06a2-2c68-418e-91aa-8541c7793afa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:46:19 compute-0 nova_compute[186176]: 2026-02-16 17:46:19.741 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:46:19 compute-0 nova_compute[186176]: 2026-02-16 17:46:19.742 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquired lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:46:19 compute-0 nova_compute[186176]: 2026-02-16 17:46:19.742 186180 DEBUG nova.network.neutron [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:46:19 compute-0 nova_compute[186176]: 2026-02-16 17:46:19.825 186180 DEBUG nova.compute.manager [req-ff024a37-0b6b-43af-b8d2-ecc6cc165f1e req-e7e466dd-946a-4865-b0e5-66df677b1547 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-changed-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:46:19 compute-0 nova_compute[186176]: 2026-02-16 17:46:19.826 186180 DEBUG nova.compute.manager [req-ff024a37-0b6b-43af-b8d2-ecc6cc165f1e req-e7e466dd-946a-4865-b0e5-66df677b1547 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Refreshing instance network info cache due to event network-changed-694c06a2-2c68-418e-91aa-8541c7793afa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:46:19 compute-0 nova_compute[186176]: 2026-02-16 17:46:19.826 186180 DEBUG oslo_concurrency.lockutils [req-ff024a37-0b6b-43af-b8d2-ecc6cc165f1e req-e7e466dd-946a-4865-b0e5-66df677b1547 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:46:19 compute-0 nova_compute[186176]: 2026-02-16 17:46:19.928 186180 DEBUG nova.network.neutron [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.376 186180 DEBUG nova.network.neutron [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Updating instance_info_cache with network_info: [{"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.397 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Releasing lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.398 186180 DEBUG nova.compute.manager [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Instance network_info: |[{"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.399 186180 DEBUG oslo_concurrency.lockutils [req-ff024a37-0b6b-43af-b8d2-ecc6cc165f1e req-e7e466dd-946a-4865-b0e5-66df677b1547 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.401 186180 DEBUG nova.network.neutron [req-ff024a37-0b6b-43af-b8d2-ecc6cc165f1e req-e7e466dd-946a-4865-b0e5-66df677b1547 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Refreshing network info cache for port 694c06a2-2c68-418e-91aa-8541c7793afa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.406 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Start _get_guest_xml network_info=[{"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.413 186180 WARNING nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.423 186180 DEBUG nova.virt.libvirt.host [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.424 186180 DEBUG nova.virt.libvirt.host [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.427 186180 DEBUG nova.virt.libvirt.host [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.427 186180 DEBUG nova.virt.libvirt.host [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.429 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.429 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.430 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.430 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.430 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.430 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.430 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.431 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.431 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.431 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.431 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.431 186180 DEBUG nova.virt.hardware [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.434 186180 DEBUG nova.virt.libvirt.vif [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1989966983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1989966983',id=23,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-9gbbdgo6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=TagList,task
_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:46:15Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=6b97d70c-fca4-4b8e-8381-e32928f2a1f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.435 186180 DEBUG nova.network.os_vif_util [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.435 186180 DEBUG nova.network.os_vif_util [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:54:43,bridge_name='br-int',has_traffic_filtering=True,id=694c06a2-2c68-418e-91aa-8541c7793afa,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap694c06a2-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.436 186180 DEBUG nova.objects.instance [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lazy-loading 'pci_devices' on Instance uuid 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.450 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:46:21 compute-0 nova_compute[186176]:   <uuid>6b97d70c-fca4-4b8e-8381-e32928f2a1f0</uuid>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   <name>instance-00000017</name>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteStrategies-server-1989966983</nova:name>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:46:21</nova:creationTime>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:46:21 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:46:21 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:46:21 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:46:21 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:46:21 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:46:21 compute-0 nova_compute[186176]:         <nova:user uuid="c54934f49b2044289bcf127662fe114b">tempest-TestExecuteStrategies-1098930400-project-member</nova:user>
Feb 16 17:46:21 compute-0 nova_compute[186176]:         <nova:project uuid="1a237c4b00c5426cb1dc6afe3c7c868c">tempest-TestExecuteStrategies-1098930400</nova:project>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:46:21 compute-0 nova_compute[186176]:         <nova:port uuid="694c06a2-2c68-418e-91aa-8541c7793afa">
Feb 16 17:46:21 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <system>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <entry name="serial">6b97d70c-fca4-4b8e-8381-e32928f2a1f0</entry>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <entry name="uuid">6b97d70c-fca4-4b8e-8381-e32928f2a1f0</entry>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     </system>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   <os>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   </os>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   <features>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   </features>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk.config"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:33:54:43"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <target dev="tap694c06a2-2c"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/console.log" append="off"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <video>
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     </video>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:46:21 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:46:21 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:46:21 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:46:21 compute-0 nova_compute[186176]: </domain>
Feb 16 17:46:21 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.451 186180 DEBUG nova.compute.manager [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Preparing to wait for external event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.451 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.451 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.451 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.452 186180 DEBUG nova.virt.libvirt.vif [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1989966983',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1989966983',id=23,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-9gbbdgo6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:46:15Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=6b97d70c-fca4-4b8e-8381-e32928f2a1f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.452 186180 DEBUG nova.network.os_vif_util [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converting VIF {"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.453 186180 DEBUG nova.network.os_vif_util [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:33:54:43,bridge_name='br-int',has_traffic_filtering=True,id=694c06a2-2c68-418e-91aa-8541c7793afa,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap694c06a2-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.453 186180 DEBUG os_vif [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:54:43,bridge_name='br-int',has_traffic_filtering=True,id=694c06a2-2c68-418e-91aa-8541c7793afa,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap694c06a2-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.453 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.454 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.454 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.456 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.456 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap694c06a2-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.456 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap694c06a2-2c, col_values=(('external_ids', {'iface-id': '694c06a2-2c68-418e-91aa-8541c7793afa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:33:54:43', 'vm-uuid': '6b97d70c-fca4-4b8e-8381-e32928f2a1f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.458 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:21 compute-0 NetworkManager[56463]: <info>  [1771263981.4606] manager: (tap694c06a2-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.461 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.466 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.467 186180 INFO os_vif [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:33:54:43,bridge_name='br-int',has_traffic_filtering=True,id=694c06a2-2c68-418e-91aa-8541c7793afa,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap694c06a2-2c')
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.524 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.524 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.525 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] No VIF found with MAC fa:16:3e:33:54:43, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:46:21 compute-0 nova_compute[186176]: 2026-02-16 17:46:21.525 186180 INFO nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Using config drive
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.171 186180 INFO nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Creating config drive at /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk.config
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.175 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpevx59jfg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.301 186180 DEBUG oslo_concurrency.processutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpevx59jfg" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:46:22 compute-0 kernel: tap694c06a2-2c: entered promiscuous mode
Feb 16 17:46:22 compute-0 NetworkManager[56463]: <info>  [1771263982.3647] manager: (tap694c06a2-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Feb 16 17:46:22 compute-0 ovn_controller[96437]: 2026-02-16T17:46:22Z|00182|binding|INFO|Claiming lport 694c06a2-2c68-418e-91aa-8541c7793afa for this chassis.
Feb 16 17:46:22 compute-0 ovn_controller[96437]: 2026-02-16T17:46:22Z|00183|binding|INFO|694c06a2-2c68-418e-91aa-8541c7793afa: Claiming fa:16:3e:33:54:43 10.100.0.3
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.365 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.374 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:54:43 10.100.0.3'], port_security=['fa:16:3e:33:54:43 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6b97d70c-fca4-4b8e-8381-e32928f2a1f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=694c06a2-2c68-418e-91aa-8541c7793afa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.375 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 694c06a2-2c68-418e-91aa-8541c7793afa in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 bound to our chassis
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.376 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.377 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:46:22 compute-0 ovn_controller[96437]: 2026-02-16T17:46:22Z|00184|binding|INFO|Setting lport 694c06a2-2c68-418e-91aa-8541c7793afa ovn-installed in OVS
Feb 16 17:46:22 compute-0 ovn_controller[96437]: 2026-02-16T17:46:22Z|00185|binding|INFO|Setting lport 694c06a2-2c68-418e-91aa-8541c7793afa up in Southbound
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.382 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.389 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3e04efbf-a8cd-488b-b67c-3510037f313f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.391 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap94cafcd0-c1 in ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.397 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap94cafcd0-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:46:22 compute-0 systemd-udevd[214417]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.397 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[47ed1236-2f89-4665-9ee0-ec5f4ada1921]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.398 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[dafe03a8-3ba3-412b-a6e8-9b134a69a8e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 NetworkManager[56463]: <info>  [1771263982.4135] device (tap694c06a2-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:46:22 compute-0 NetworkManager[56463]: <info>  [1771263982.4143] device (tap694c06a2-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.417 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[86b9e76c-9922-4bc7-8488-363c56cec56c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 systemd-machined[155631]: New machine qemu-18-instance-00000017.
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.431 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[0d158a60-bbae-48a2-a816-0d910ff0a1f8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000017.
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.462 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[a0beb461-e5c5-43f6-8ae8-fd1fd22cd647]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.466 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[a29d5239-f9f9-409c-a5da-a68befb80ed5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 NetworkManager[56463]: <info>  [1771263982.4679] manager: (tap94cafcd0-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.495 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[7b1c4702-2090-4859-a349-cf511eec4951]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.500 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[9abb2e5b-1646-42b0-9ecd-e809195ad785]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 NetworkManager[56463]: <info>  [1771263982.5188] device (tap94cafcd0-c0): carrier: link connected
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.521 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb78789-c0f4-4cf4-95b9-b436d873f74f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.539 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[6187aa33-22c1-45df-885e-af47267d136d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557804, 'reachable_time': 43605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214452, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.554 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[23c08ef4-e6fa-41b3-b343-7cc336ca5e14]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:6951'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 557804, 'tstamp': 557804}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214453, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.573 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7bb398-365b-483c-9592-371dcd0221ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap94cafcd0-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:69:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557804, 'reachable_time': 43605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214454, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.603 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c534f6-e379-4ab5-bdde-a68fcde123c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.632 186180 DEBUG nova.compute.manager [req-b9d7f07e-9f08-419c-bd05-a157ad336748 req-3305ea7e-1df4-4b37-8101-8d0eeac0e758 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.633 186180 DEBUG oslo_concurrency.lockutils [req-b9d7f07e-9f08-419c-bd05-a157ad336748 req-3305ea7e-1df4-4b37-8101-8d0eeac0e758 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.634 186180 DEBUG oslo_concurrency.lockutils [req-b9d7f07e-9f08-419c-bd05-a157ad336748 req-3305ea7e-1df4-4b37-8101-8d0eeac0e758 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.634 186180 DEBUG oslo_concurrency.lockutils [req-b9d7f07e-9f08-419c-bd05-a157ad336748 req-3305ea7e-1df4-4b37-8101-8d0eeac0e758 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.634 186180 DEBUG nova.compute.manager [req-b9d7f07e-9f08-419c-bd05-a157ad336748 req-3305ea7e-1df4-4b37-8101-8d0eeac0e758 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Processing event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.672 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[62835e1f-aaac-4919-84cc-e3e762c1d16f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.674 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.674 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.675 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap94cafcd0-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:46:22 compute-0 NetworkManager[56463]: <info>  [1771263982.7177] manager: (tap94cafcd0-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Feb 16 17:46:22 compute-0 kernel: tap94cafcd0-c0: entered promiscuous mode
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.717 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.722 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap94cafcd0-c0, col_values=(('external_ids', {'iface-id': '5c28d585-b48c-40c6-b5e7-f1e59317b2de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.724 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:22 compute-0 ovn_controller[96437]: 2026-02-16T17:46:22Z|00186|binding|INFO|Releasing lport 5c28d585-b48c-40c6-b5e7-f1e59317b2de from this chassis (sb_readonly=0)
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.726 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.729 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.727 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[49cfb194-f144-4945-a351-49cc4e4d482b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.731 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.pid.haproxy
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:46:22 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:22.732 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'env', 'PROCESS_TAG=haproxy-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.736 186180 DEBUG nova.compute.manager [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.737 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263982.7353377, 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.737 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] VM Started (Lifecycle Event)
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.740 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.744 186180 INFO nova.virt.libvirt.driver [-] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Instance spawned successfully.
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.744 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.762 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.768 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.772 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.773 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.773 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.774 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.774 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.775 186180 DEBUG nova.virt.libvirt.driver [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.802 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.803 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263982.7367399, 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.803 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] VM Paused (Lifecycle Event)
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.841 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.847 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771263982.7399836, 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.847 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] VM Resumed (Lifecycle Event)
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.854 186180 INFO nova.compute.manager [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Took 7.61 seconds to spawn the instance on the hypervisor.
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.855 186180 DEBUG nova.compute.manager [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.865 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.867 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.897 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.925 186180 INFO nova.compute.manager [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Took 8.07 seconds to build instance.
Feb 16 17:46:22 compute-0 nova_compute[186176]: 2026-02-16 17:46:22.948 186180 DEBUG oslo_concurrency.lockutils [None req-47246785-9670-4e16-b802-9959cf261617 c54934f49b2044289bcf127662fe114b 1a237c4b00c5426cb1dc6afe3c7c868c - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:23 compute-0 podman[214493]: 2026-02-16 17:46:23.065837994 +0000 UTC m=+0.057492831 container create 301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 16 17:46:23 compute-0 systemd[1]: Started libpod-conmon-301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8.scope.
Feb 16 17:46:23 compute-0 podman[214493]: 2026-02-16 17:46:23.035204223 +0000 UTC m=+0.026859150 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:46:23 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:46:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/655bcee3733647115449bd87b7c26cf9787d3c8ea84428ee6b3694f78baeb01c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:46:23 compute-0 podman[214493]: 2026-02-16 17:46:23.15748391 +0000 UTC m=+0.149138787 container init 301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 16 17:46:23 compute-0 podman[214493]: 2026-02-16 17:46:23.162218796 +0000 UTC m=+0.153873663 container start 301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 16 17:46:23 compute-0 nova_compute[186176]: 2026-02-16 17:46:23.187 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:23.188 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:46:23 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[214509]: [NOTICE]   (214513) : New worker (214515) forked
Feb 16 17:46:23 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[214509]: [NOTICE]   (214513) : Loading success.
Feb 16 17:46:23 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:23.226 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:46:23 compute-0 nova_compute[186176]: 2026-02-16 17:46:23.270 186180 DEBUG nova.network.neutron [req-ff024a37-0b6b-43af-b8d2-ecc6cc165f1e req-e7e466dd-946a-4865-b0e5-66df677b1547 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Updated VIF entry in instance network info cache for port 694c06a2-2c68-418e-91aa-8541c7793afa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:46:23 compute-0 nova_compute[186176]: 2026-02-16 17:46:23.270 186180 DEBUG nova.network.neutron [req-ff024a37-0b6b-43af-b8d2-ecc6cc165f1e req-e7e466dd-946a-4865-b0e5-66df677b1547 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Updating instance_info_cache with network_info: [{"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:46:23 compute-0 nova_compute[186176]: 2026-02-16 17:46:23.286 186180 DEBUG oslo_concurrency.lockutils [req-ff024a37-0b6b-43af-b8d2-ecc6cc165f1e req-e7e466dd-946a-4865-b0e5-66df677b1547 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:46:24 compute-0 nova_compute[186176]: 2026-02-16 17:46:24.699 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:24 compute-0 nova_compute[186176]: 2026-02-16 17:46:24.728 186180 DEBUG nova.compute.manager [req-e754d114-d8ff-4f05-a60e-00e2fbacca96 req-ac929d2c-3ee1-4c5e-ac0f-2dcbc9ddfdd8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:46:24 compute-0 nova_compute[186176]: 2026-02-16 17:46:24.729 186180 DEBUG oslo_concurrency.lockutils [req-e754d114-d8ff-4f05-a60e-00e2fbacca96 req-ac929d2c-3ee1-4c5e-ac0f-2dcbc9ddfdd8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:24 compute-0 nova_compute[186176]: 2026-02-16 17:46:24.729 186180 DEBUG oslo_concurrency.lockutils [req-e754d114-d8ff-4f05-a60e-00e2fbacca96 req-ac929d2c-3ee1-4c5e-ac0f-2dcbc9ddfdd8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:24 compute-0 nova_compute[186176]: 2026-02-16 17:46:24.729 186180 DEBUG oslo_concurrency.lockutils [req-e754d114-d8ff-4f05-a60e-00e2fbacca96 req-ac929d2c-3ee1-4c5e-ac0f-2dcbc9ddfdd8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:24 compute-0 nova_compute[186176]: 2026-02-16 17:46:24.729 186180 DEBUG nova.compute.manager [req-e754d114-d8ff-4f05-a60e-00e2fbacca96 req-ac929d2c-3ee1-4c5e-ac0f-2dcbc9ddfdd8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] No waiting events found dispatching network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:46:24 compute-0 nova_compute[186176]: 2026-02-16 17:46:24.730 186180 WARNING nova.compute.manager [req-e754d114-d8ff-4f05-a60e-00e2fbacca96 req-ac929d2c-3ee1-4c5e-ac0f-2dcbc9ddfdd8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received unexpected event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa for instance with vm_state active and task_state None.
Feb 16 17:46:26 compute-0 nova_compute[186176]: 2026-02-16 17:46:26.458 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:27 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:27.228 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:46:29 compute-0 nova_compute[186176]: 2026-02-16 17:46:29.701 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:29 compute-0 podman[195505]: time="2026-02-16T17:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:46:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:46:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2640 "" "Go-http-client/1.1"
Feb 16 17:46:31 compute-0 openstack_network_exporter[198360]: ERROR   17:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:46:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:46:31 compute-0 openstack_network_exporter[198360]: ERROR   17:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:46:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:46:31 compute-0 nova_compute[186176]: 2026-02-16 17:46:31.462 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:32 compute-0 podman[214524]: 2026-02-16 17:46:32.141561089 +0000 UTC m=+0.101042868 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, io.buildah.version=1.33.7)
Feb 16 17:46:34 compute-0 nova_compute[186176]: 2026-02-16 17:46:34.703 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:35 compute-0 podman[214561]: 2026-02-16 17:46:35.089571757 +0000 UTC m=+0.054022795 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:46:35 compute-0 ovn_controller[96437]: 2026-02-16T17:46:35Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:33:54:43 10.100.0.3
Feb 16 17:46:35 compute-0 ovn_controller[96437]: 2026-02-16T17:46:35Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:33:54:43 10.100.0.3
Feb 16 17:46:36 compute-0 nova_compute[186176]: 2026-02-16 17:46:36.465 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:38.179 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:38.180 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:46:38.181 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:39 compute-0 nova_compute[186176]: 2026-02-16 17:46:39.705 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:41 compute-0 podman[214581]: 2026-02-16 17:46:41.095934189 +0000 UTC m=+0.053595424 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:46:41 compute-0 podman[214580]: 2026-02-16 17:46:41.12287249 +0000 UTC m=+0.091600007 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
tcib_managed=true, container_name=ovn_controller)
Feb 16 17:46:41 compute-0 nova_compute[186176]: 2026-02-16 17:46:41.467 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:43 compute-0 nova_compute[186176]: 2026-02-16 17:46:43.252 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:44 compute-0 nova_compute[186176]: 2026-02-16 17:46:44.707 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:46 compute-0 nova_compute[186176]: 2026-02-16 17:46:46.471 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:49 compute-0 nova_compute[186176]: 2026-02-16 17:46:49.710 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:51 compute-0 nova_compute[186176]: 2026-02-16 17:46:51.349 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:51 compute-0 nova_compute[186176]: 2026-02-16 17:46:51.349 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:46:51 compute-0 nova_compute[186176]: 2026-02-16 17:46:51.350 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:46:51 compute-0 nova_compute[186176]: 2026-02-16 17:46:51.475 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:52 compute-0 nova_compute[186176]: 2026-02-16 17:46:52.084 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:46:52 compute-0 nova_compute[186176]: 2026-02-16 17:46:52.085 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:46:52 compute-0 nova_compute[186176]: 2026-02-16 17:46:52.085 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:46:52 compute-0 nova_compute[186176]: 2026-02-16 17:46:52.085 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.115 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Updating instance_info_cache with network_info: [{"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.140 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.141 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.141 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.142 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.159 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Triggering sync for uuid 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.160 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.161 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.161 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.186 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.187 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.187 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.188 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.196 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.265 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.352 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.353 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.423 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.627 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.629 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5638MB free_disk=73.19369506835938GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.629 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.629 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.711 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.722 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.723 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.723 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.808 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.827 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.858 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.858 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.859 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.859 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 17:46:54 compute-0 nova_compute[186176]: 2026-02-16 17:46:54.874 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 17:46:55 compute-0 nova_compute[186176]: 2026-02-16 17:46:55.029 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:55 compute-0 nova_compute[186176]: 2026-02-16 17:46:55.153 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:55 compute-0 nova_compute[186176]: 2026-02-16 17:46:55.153 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:46:55 compute-0 nova_compute[186176]: 2026-02-16 17:46:55.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:56 compute-0 nova_compute[186176]: 2026-02-16 17:46:56.311 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:56 compute-0 nova_compute[186176]: 2026-02-16 17:46:56.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:46:56 compute-0 nova_compute[186176]: 2026-02-16 17:46:56.479 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:59 compute-0 nova_compute[186176]: 2026-02-16 17:46:59.714 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:46:59 compute-0 podman[195505]: time="2026-02-16T17:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:46:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:46:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Feb 16 17:47:00 compute-0 nova_compute[186176]: 2026-02-16 17:47:00.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:47:00 compute-0 nova_compute[186176]: 2026-02-16 17:47:00.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:47:00 compute-0 nova_compute[186176]: 2026-02-16 17:47:00.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 17:47:01 compute-0 openstack_network_exporter[198360]: ERROR   17:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:47:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:47:01 compute-0 openstack_network_exporter[198360]: ERROR   17:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:47:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:47:01 compute-0 nova_compute[186176]: 2026-02-16 17:47:01.481 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:02 compute-0 ovn_controller[96437]: 2026-02-16T17:47:02Z|00187|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 16 17:47:02 compute-0 nova_compute[186176]: 2026-02-16 17:47:02.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:47:03 compute-0 podman[214639]: 2026-02-16 17:47:03.071899957 +0000 UTC m=+0.045816174 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 16 17:47:04 compute-0 nova_compute[186176]: 2026-02-16 17:47:04.716 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:06 compute-0 podman[214660]: 2026-02-16 17:47:06.066854627 +0000 UTC m=+0.042647757 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 17:47:06 compute-0 nova_compute[186176]: 2026-02-16 17:47:06.484 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:07 compute-0 nova_compute[186176]: 2026-02-16 17:47:07.348 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:47:09 compute-0 nova_compute[186176]: 2026-02-16 17:47:09.718 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:11 compute-0 nova_compute[186176]: 2026-02-16 17:47:11.488 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:12 compute-0 podman[214682]: 2026-02-16 17:47:12.08588278 +0000 UTC m=+0.049165407 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:47:12 compute-0 podman[214681]: 2026-02-16 17:47:12.176951312 +0000 UTC m=+0.140788302 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Feb 16 17:47:14 compute-0 nova_compute[186176]: 2026-02-16 17:47:14.761 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:16 compute-0 nova_compute[186176]: 2026-02-16 17:47:16.491 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:19 compute-0 nova_compute[186176]: 2026-02-16 17:47:19.763 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:21 compute-0 nova_compute[186176]: 2026-02-16 17:47:21.495 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:22 compute-0 nova_compute[186176]: 2026-02-16 17:47:22.573 186180 DEBUG nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Check if temp file /var/lib/nova/instances/tmpx27cnt8t exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 17:47:22 compute-0 nova_compute[186176]: 2026-02-16 17:47:22.574 186180 DEBUG nova.compute.manager [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx27cnt8t',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b97d70c-fca4-4b8e-8381-e32928f2a1f0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 17:47:23 compute-0 nova_compute[186176]: 2026-02-16 17:47:23.266 186180 DEBUG oslo_concurrency.processutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:47:23 compute-0 nova_compute[186176]: 2026-02-16 17:47:23.324 186180 DEBUG oslo_concurrency.processutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:47:23 compute-0 nova_compute[186176]: 2026-02-16 17:47:23.326 186180 DEBUG oslo_concurrency.processutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:47:23 compute-0 nova_compute[186176]: 2026-02-16 17:47:23.411 186180 DEBUG oslo_concurrency.processutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:47:24 compute-0 nova_compute[186176]: 2026-02-16 17:47:24.765 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:24 compute-0 sshd-session[214741]: Accepted publickey for nova from 192.168.122.101 port 32862 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:47:24 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 17:47:24 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 17:47:24 compute-0 systemd-logind[821]: New session 41 of user nova.
Feb 16 17:47:24 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 17:47:24 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 17:47:24 compute-0 systemd[214745]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:47:25 compute-0 systemd[214745]: Queued start job for default target Main User Target.
Feb 16 17:47:25 compute-0 systemd[214745]: Created slice User Application Slice.
Feb 16 17:47:25 compute-0 systemd[214745]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:47:25 compute-0 systemd[214745]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 17:47:25 compute-0 systemd[214745]: Reached target Paths.
Feb 16 17:47:25 compute-0 systemd[214745]: Reached target Timers.
Feb 16 17:47:25 compute-0 systemd[214745]: Starting D-Bus User Message Bus Socket...
Feb 16 17:47:25 compute-0 systemd[214745]: Starting Create User's Volatile Files and Directories...
Feb 16 17:47:25 compute-0 systemd[214745]: Finished Create User's Volatile Files and Directories.
Feb 16 17:47:25 compute-0 systemd[214745]: Listening on D-Bus User Message Bus Socket.
Feb 16 17:47:25 compute-0 systemd[214745]: Reached target Sockets.
Feb 16 17:47:25 compute-0 systemd[214745]: Reached target Basic System.
Feb 16 17:47:25 compute-0 systemd[214745]: Reached target Main User Target.
Feb 16 17:47:25 compute-0 systemd[214745]: Startup finished in 132ms.
Feb 16 17:47:25 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 17:47:25 compute-0 systemd[1]: Started Session 41 of User nova.
Feb 16 17:47:25 compute-0 sshd-session[214741]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:47:25 compute-0 sshd-session[214760]: Received disconnect from 192.168.122.101 port 32862:11: disconnected by user
Feb 16 17:47:25 compute-0 sshd-session[214760]: Disconnected from user nova 192.168.122.101 port 32862
Feb 16 17:47:25 compute-0 sshd-session[214741]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:47:25 compute-0 systemd[1]: session-41.scope: Deactivated successfully.
Feb 16 17:47:25 compute-0 systemd-logind[821]: Session 41 logged out. Waiting for processes to exit.
Feb 16 17:47:25 compute-0 systemd-logind[821]: Removed session 41.
Feb 16 17:47:26 compute-0 nova_compute[186176]: 2026-02-16 17:47:26.498 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:27 compute-0 nova_compute[186176]: 2026-02-16 17:47:27.253 186180 DEBUG nova.compute.manager [req-410e9b59-ae37-4ca1-8a4f-0c33ccfe44ca req-decb950c-8622-4d8d-a475-4f1ba27d1584 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-unplugged-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:47:27 compute-0 nova_compute[186176]: 2026-02-16 17:47:27.254 186180 DEBUG oslo_concurrency.lockutils [req-410e9b59-ae37-4ca1-8a4f-0c33ccfe44ca req-decb950c-8622-4d8d-a475-4f1ba27d1584 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:27 compute-0 nova_compute[186176]: 2026-02-16 17:47:27.255 186180 DEBUG oslo_concurrency.lockutils [req-410e9b59-ae37-4ca1-8a4f-0c33ccfe44ca req-decb950c-8622-4d8d-a475-4f1ba27d1584 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:27 compute-0 nova_compute[186176]: 2026-02-16 17:47:27.255 186180 DEBUG oslo_concurrency.lockutils [req-410e9b59-ae37-4ca1-8a4f-0c33ccfe44ca req-decb950c-8622-4d8d-a475-4f1ba27d1584 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:27 compute-0 nova_compute[186176]: 2026-02-16 17:47:27.256 186180 DEBUG nova.compute.manager [req-410e9b59-ae37-4ca1-8a4f-0c33ccfe44ca req-decb950c-8622-4d8d-a475-4f1ba27d1584 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] No waiting events found dispatching network-vif-unplugged-694c06a2-2c68-418e-91aa-8541c7793afa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:47:27 compute-0 nova_compute[186176]: 2026-02-16 17:47:27.256 186180 DEBUG nova.compute.manager [req-410e9b59-ae37-4ca1-8a4f-0c33ccfe44ca req-decb950c-8622-4d8d-a475-4f1ba27d1584 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-unplugged-694c06a2-2c68-418e-91aa-8541c7793afa for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.167 186180 INFO nova.compute.manager [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Took 5.76 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.168 186180 DEBUG nova.compute.manager [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.181 186180 DEBUG nova.compute.manager [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpx27cnt8t',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6b97d70c-fca4-4b8e-8381-e32928f2a1f0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(22a7b629-663e-49c2-ba7d-d8b941ae1bff),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.200 186180 DEBUG nova.objects.instance [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.202 186180 DEBUG nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.205 186180 DEBUG nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.206 186180 DEBUG nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.219 186180 DEBUG nova.virt.libvirt.vif [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1989966983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1989966983',id=23,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:46:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-9gbbdgo6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:46:22Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=6b97d70c-fca4-4b8e-8381-e32928f2a1f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.220 186180 DEBUG nova.network.os_vif_util [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.221 186180 DEBUG nova.network.os_vif_util [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:54:43,bridge_name='br-int',has_traffic_filtering=True,id=694c06a2-2c68-418e-91aa-8541c7793afa,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap694c06a2-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.222 186180 DEBUG nova.virt.libvirt.migration [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 17:47:29 compute-0 nova_compute[186176]:   <mac address="fa:16:3e:33:54:43"/>
Feb 16 17:47:29 compute-0 nova_compute[186176]:   <model type="virtio"/>
Feb 16 17:47:29 compute-0 nova_compute[186176]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:47:29 compute-0 nova_compute[186176]:   <mtu size="1442"/>
Feb 16 17:47:29 compute-0 nova_compute[186176]:   <target dev="tap694c06a2-2c"/>
Feb 16 17:47:29 compute-0 nova_compute[186176]: </interface>
Feb 16 17:47:29 compute-0 nova_compute[186176]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.223 186180 DEBUG nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.325 186180 DEBUG nova.compute.manager [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.325 186180 DEBUG oslo_concurrency.lockutils [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.326 186180 DEBUG oslo_concurrency.lockutils [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.326 186180 DEBUG oslo_concurrency.lockutils [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.327 186180 DEBUG nova.compute.manager [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] No waiting events found dispatching network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.327 186180 WARNING nova.compute.manager [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received unexpected event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa for instance with vm_state active and task_state migrating.
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.328 186180 DEBUG nova.compute.manager [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-changed-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.328 186180 DEBUG nova.compute.manager [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Refreshing instance network info cache due to event network-changed-694c06a2-2c68-418e-91aa-8541c7793afa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.328 186180 DEBUG oslo_concurrency.lockutils [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.329 186180 DEBUG oslo_concurrency.lockutils [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.329 186180 DEBUG nova.network.neutron [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Refreshing network info cache for port 694c06a2-2c68-418e-91aa-8541c7793afa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.708 186180 DEBUG nova.virt.libvirt.migration [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.708 186180 INFO nova.virt.libvirt.migration [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 17:47:29 compute-0 podman[195505]: time="2026-02-16T17:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:47:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:47:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2642 "" "Go-http-client/1.1"
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.768 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:29 compute-0 nova_compute[186176]: 2026-02-16 17:47:29.804 186180 INFO nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 17:47:30 compute-0 nova_compute[186176]: 2026-02-16 17:47:30.307 186180 DEBUG nova.virt.libvirt.migration [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:47:30 compute-0 nova_compute[186176]: 2026-02-16 17:47:30.307 186180 DEBUG nova.virt.libvirt.migration [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:47:30 compute-0 nova_compute[186176]: 2026-02-16 17:47:30.811 186180 DEBUG nova.virt.libvirt.migration [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:47:30 compute-0 nova_compute[186176]: 2026-02-16 17:47:30.812 186180 DEBUG nova.virt.libvirt.migration [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:47:30 compute-0 nova_compute[186176]: 2026-02-16 17:47:30.901 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264050.900843, 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:47:30 compute-0 nova_compute[186176]: 2026-02-16 17:47:30.902 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] VM Paused (Lifecycle Event)
Feb 16 17:47:30 compute-0 nova_compute[186176]: 2026-02-16 17:47:30.920 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:47:30 compute-0 nova_compute[186176]: 2026-02-16 17:47:30.925 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:47:30 compute-0 nova_compute[186176]: 2026-02-16 17:47:30.959 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 16 17:47:31 compute-0 kernel: tap694c06a2-2c (unregistering): left promiscuous mode
Feb 16 17:47:31 compute-0 NetworkManager[56463]: <info>  [1771264051.0542] device (tap694c06a2-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:47:31 compute-0 ovn_controller[96437]: 2026-02-16T17:47:31Z|00188|binding|INFO|Releasing lport 694c06a2-2c68-418e-91aa-8541c7793afa from this chassis (sb_readonly=0)
Feb 16 17:47:31 compute-0 ovn_controller[96437]: 2026-02-16T17:47:31Z|00189|binding|INFO|Setting lport 694c06a2-2c68-418e-91aa-8541c7793afa down in Southbound
Feb 16 17:47:31 compute-0 ovn_controller[96437]: 2026-02-16T17:47:31Z|00190|binding|INFO|Removing iface tap694c06a2-2c ovn-installed in OVS
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.096 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.102 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.102 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:54:43 10.100.0.3'], port_security=['fa:16:3e:33:54:43 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '2e3a84a9-c1b4-4b1e-92e3-57d0875592cc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6b97d70c-fca4-4b8e-8381-e32928f2a1f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=694c06a2-2c68-418e-91aa-8541c7793afa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.106 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 694c06a2-2c68-418e-91aa-8541c7793afa in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 unbound from our chassis
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.108 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.112 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3cbfd1a3-2ab4-4c94-aec7-5c1a975692bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.114 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 namespace which is not needed anymore
Feb 16 17:47:31 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000017.scope: Deactivated successfully.
Feb 16 17:47:31 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000017.scope: Consumed 14.377s CPU time.
Feb 16 17:47:31 compute-0 systemd-machined[155631]: Machine qemu-18-instance-00000017 terminated.
Feb 16 17:47:31 compute-0 systemd-udevd[214776]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:47:31 compute-0 kernel: tap694c06a2-2c: entered promiscuous mode
Feb 16 17:47:31 compute-0 kernel: tap694c06a2-2c (unregistering): left promiscuous mode
Feb 16 17:47:31 compute-0 NetworkManager[56463]: <info>  [1771264051.2597] manager: (tap694c06a2-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.266 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:31 compute-0 ovn_controller[96437]: 2026-02-16T17:47:31Z|00191|binding|INFO|Claiming lport 694c06a2-2c68-418e-91aa-8541c7793afa for this chassis.
Feb 16 17:47:31 compute-0 ovn_controller[96437]: 2026-02-16T17:47:31Z|00192|binding|INFO|694c06a2-2c68-418e-91aa-8541c7793afa: Claiming fa:16:3e:33:54:43 10.100.0.3
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.274 186180 DEBUG nova.compute.manager [req-39110189-93b3-4369-88cf-eeae2510be6a req-554b7352-1f88-443d-a64c-e24831f43414 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-unplugged-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.275 186180 DEBUG oslo_concurrency.lockutils [req-39110189-93b3-4369-88cf-eeae2510be6a req-554b7352-1f88-443d-a64c-e24831f43414 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.275 186180 DEBUG oslo_concurrency.lockutils [req-39110189-93b3-4369-88cf-eeae2510be6a req-554b7352-1f88-443d-a64c-e24831f43414 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.275 186180 DEBUG oslo_concurrency.lockutils [req-39110189-93b3-4369-88cf-eeae2510be6a req-554b7352-1f88-443d-a64c-e24831f43414 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.275 186180 DEBUG nova.compute.manager [req-39110189-93b3-4369-88cf-eeae2510be6a req-554b7352-1f88-443d-a64c-e24831f43414 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] No waiting events found dispatching network-vif-unplugged-694c06a2-2c68-418e-91aa-8541c7793afa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.276 186180 DEBUG nova.compute.manager [req-39110189-93b3-4369-88cf-eeae2510be6a req-554b7352-1f88-443d-a64c-e24831f43414 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-unplugged-694c06a2-2c68-418e-91aa-8541c7793afa for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.281 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:54:43 10.100.0.3'], port_security=['fa:16:3e:33:54:43 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '2e3a84a9-c1b4-4b1e-92e3-57d0875592cc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6b97d70c-fca4-4b8e-8381-e32928f2a1f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=694c06a2-2c68-418e-91aa-8541c7793afa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:47:31 compute-0 ovn_controller[96437]: 2026-02-16T17:47:31Z|00193|binding|INFO|Releasing lport 694c06a2-2c68-418e-91aa-8541c7793afa from this chassis (sb_readonly=0)
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.284 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:31 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[214509]: [NOTICE]   (214513) : haproxy version is 2.8.14-c23fe91
Feb 16 17:47:31 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[214509]: [NOTICE]   (214513) : path to executable is /usr/sbin/haproxy
Feb 16 17:47:31 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[214509]: [WARNING]  (214513) : Exiting Master process...
Feb 16 17:47:31 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[214509]: [ALERT]    (214513) : Current worker (214515) exited with code 143 (Terminated)
Feb 16 17:47:31 compute-0 neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4[214509]: [WARNING]  (214513) : All workers exited. Exiting... (0)
Feb 16 17:47:31 compute-0 systemd[1]: libpod-301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8.scope: Deactivated successfully.
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.291 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:33:54:43 10.100.0.3'], port_security=['fa:16:3e:33:54:43 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '2e3a84a9-c1b4-4b1e-92e3-57d0875592cc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6b97d70c-fca4-4b8e-8381-e32928f2a1f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a237c4b00c5426cb1dc6afe3c7c868c', 'neutron:revision_number': '8', 'neutron:security_group_ids': '27048986-78c3-40df-bfe8-df04a7b418f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2bc497-d54f-4791-8004-249e87375ec0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=694c06a2-2c68-418e-91aa-8541c7793afa) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:47:31 compute-0 podman[214797]: 2026-02-16 17:47:31.295710277 +0000 UTC m=+0.068119221 container died 301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.309 186180 DEBUG nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.309 186180 DEBUG nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.309 186180 DEBUG nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.315 186180 DEBUG nova.virt.libvirt.guest [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '6b97d70c-fca4-4b8e-8381-e32928f2a1f0' (instance-00000017) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.316 186180 INFO nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Migration operation has completed
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.316 186180 INFO nova.compute.manager [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] _post_live_migration() is started..
Feb 16 17:47:31 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8-userdata-shm.mount: Deactivated successfully.
Feb 16 17:47:31 compute-0 systemd[1]: var-lib-containers-storage-overlay-655bcee3733647115449bd87b7c26cf9787d3c8ea84428ee6b3694f78baeb01c-merged.mount: Deactivated successfully.
Feb 16 17:47:31 compute-0 podman[214797]: 2026-02-16 17:47:31.332454528 +0000 UTC m=+0.104863462 container cleanup 301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 17:47:31 compute-0 systemd[1]: libpod-conmon-301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8.scope: Deactivated successfully.
Feb 16 17:47:31 compute-0 podman[214835]: 2026-02-16 17:47:31.387141498 +0000 UTC m=+0.037015818 container remove 301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.393 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[13029386-4e82-4d89-aa50-e8378e1b850b]: (4, ('Mon Feb 16 05:47:31 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8)\n301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8\nMon Feb 16 05:47:31 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 (301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8)\n301094383ba19f14c96b34009b1c400527e6a39c3195e1618b517dcf9d6954c8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.395 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[fc97ff74-0aea-46a4-a6b3-46850736e752]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.396 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap94cafcd0-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:47:31 compute-0 kernel: tap94cafcd0-c0: left promiscuous mode
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.399 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.407 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.411 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[2a96c00d-36c0-4975-aeb8-5d895942b5cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:47:31 compute-0 openstack_network_exporter[198360]: ERROR   17:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:47:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:47:31 compute-0 openstack_network_exporter[198360]: ERROR   17:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:47:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.425 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[40f473ba-acd0-45b6-94c8-b53f5ecc7da8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.428 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e4ddc6a7-edd4-4c37-b9d4-88c83e8717b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.443 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[dda73d18-5235-4731-9764-1495e2637469]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 557798, 'reachable_time': 26777, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214857, 'error': None, 'target': 'ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:47:31 compute-0 systemd[1]: run-netns-ovnmeta\x2d94cafcd0\x2dc7c2\x2d48b4\x2da2dd\x2d21c16ce48dc4.mount: Deactivated successfully.
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.447 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.448 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[9e499ac7-b398-4564-9366-2d4422e24599]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.449 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 694c06a2-2c68-418e-91aa-8541c7793afa in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 unbound from our chassis
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.451 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.452 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[17c92715-4c63-49b3-aa17-8f9549fbe060]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.453 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 694c06a2-2c68-418e-91aa-8541c7793afa in datapath 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4 unbound from our chassis
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.455 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:47:31 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:31.456 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[76ff2101-12b3-468d-a6bd-3399cdaa8728]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:47:31 compute-0 nova_compute[186176]: 2026-02-16 17:47:31.502 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:32 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:32.192 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:47:32 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:32.194 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:47:32 compute-0 nova_compute[186176]: 2026-02-16 17:47:32.218 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:32 compute-0 nova_compute[186176]: 2026-02-16 17:47:32.498 186180 DEBUG nova.compute.manager [req-45113e0b-0360-4a01-9d86-b5523c7f6818 req-cef97154-47fa-4111-a9cc-d5dc89b58f18 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-unplugged-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:47:32 compute-0 nova_compute[186176]: 2026-02-16 17:47:32.498 186180 DEBUG oslo_concurrency.lockutils [req-45113e0b-0360-4a01-9d86-b5523c7f6818 req-cef97154-47fa-4111-a9cc-d5dc89b58f18 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:32 compute-0 nova_compute[186176]: 2026-02-16 17:47:32.499 186180 DEBUG oslo_concurrency.lockutils [req-45113e0b-0360-4a01-9d86-b5523c7f6818 req-cef97154-47fa-4111-a9cc-d5dc89b58f18 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:32 compute-0 nova_compute[186176]: 2026-02-16 17:47:32.499 186180 DEBUG oslo_concurrency.lockutils [req-45113e0b-0360-4a01-9d86-b5523c7f6818 req-cef97154-47fa-4111-a9cc-d5dc89b58f18 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:32 compute-0 nova_compute[186176]: 2026-02-16 17:47:32.499 186180 DEBUG nova.compute.manager [req-45113e0b-0360-4a01-9d86-b5523c7f6818 req-cef97154-47fa-4111-a9cc-d5dc89b58f18 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] No waiting events found dispatching network-vif-unplugged-694c06a2-2c68-418e-91aa-8541c7793afa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:47:32 compute-0 nova_compute[186176]: 2026-02-16 17:47:32.500 186180 DEBUG nova.compute.manager [req-45113e0b-0360-4a01-9d86-b5523c7f6818 req-cef97154-47fa-4111-a9cc-d5dc89b58f18 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-unplugged-694c06a2-2c68-418e-91aa-8541c7793afa for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.118 186180 DEBUG nova.network.neutron [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Updated VIF entry in instance network info cache for port 694c06a2-2c68-418e-91aa-8541c7793afa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.118 186180 DEBUG nova.network.neutron [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Updating instance_info_cache with network_info: [{"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.145 186180 DEBUG oslo_concurrency.lockutils [req-333e4963-448b-47ca-9764-332b191995dc req-0d995a44-684f-40a6-b43f-17c7b6db266e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-6b97d70c-fca4-4b8e-8381-e32928f2a1f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.214 186180 DEBUG nova.network.neutron [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Activated binding for port 694c06a2-2c68-418e-91aa-8541c7793afa and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.215 186180 DEBUG nova.compute.manager [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.217 186180 DEBUG nova.virt.libvirt.vif [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:46:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1989966983',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1989966983',id=23,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:46:22Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1a237c4b00c5426cb1dc6afe3c7c868c',ramdisk_id='',reservation_id='r-9gbbdgo6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',
owner_project_name='tempest-TestExecuteStrategies-1098930400',owner_user_name='tempest-TestExecuteStrategies-1098930400-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:47:20Z,user_data=None,user_id='c54934f49b2044289bcf127662fe114b',uuid=6b97d70c-fca4-4b8e-8381-e32928f2a1f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.218 186180 DEBUG nova.network.os_vif_util [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "694c06a2-2c68-418e-91aa-8541c7793afa", "address": "fa:16:3e:33:54:43", "network": {"id": "94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-598038299-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a237c4b00c5426cb1dc6afe3c7c868c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap694c06a2-2c", "ovs_interfaceid": "694c06a2-2c68-418e-91aa-8541c7793afa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.219 186180 DEBUG nova.network.os_vif_util [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:33:54:43,bridge_name='br-int',has_traffic_filtering=True,id=694c06a2-2c68-418e-91aa-8541c7793afa,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap694c06a2-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.220 186180 DEBUG os_vif [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:54:43,bridge_name='br-int',has_traffic_filtering=True,id=694c06a2-2c68-418e-91aa-8541c7793afa,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap694c06a2-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.223 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.224 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap694c06a2-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.230 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.235 186180 INFO os_vif [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:33:54:43,bridge_name='br-int',has_traffic_filtering=True,id=694c06a2-2c68-418e-91aa-8541c7793afa,network=Network(94cafcd0-c7c2-48b4-a2dd-21c16ce48dc4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap694c06a2-2c')
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.236 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.237 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.238 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.238 186180 DEBUG nova.compute.manager [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.239 186180 INFO nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Deleting instance files /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0_del
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.241 186180 INFO nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Deletion of /var/lib/nova/instances/6b97d70c-fca4-4b8e-8381-e32928f2a1f0_del complete
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.355 186180 DEBUG nova.compute.manager [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.356 186180 DEBUG oslo_concurrency.lockutils [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.357 186180 DEBUG oslo_concurrency.lockutils [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.357 186180 DEBUG oslo_concurrency.lockutils [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.358 186180 DEBUG nova.compute.manager [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] No waiting events found dispatching network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.358 186180 WARNING nova.compute.manager [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received unexpected event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa for instance with vm_state active and task_state migrating.
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.359 186180 DEBUG nova.compute.manager [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.360 186180 DEBUG oslo_concurrency.lockutils [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.360 186180 DEBUG oslo_concurrency.lockutils [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.361 186180 DEBUG oslo_concurrency.lockutils [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.361 186180 DEBUG nova.compute.manager [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] No waiting events found dispatching network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:47:33 compute-0 nova_compute[186176]: 2026-02-16 17:47:33.362 186180 WARNING nova.compute.manager [req-0663fba1-faac-4d5b-a5c6-81d99956a632 req-5685aed7-8c7c-4258-9459-929487da39d6 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received unexpected event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa for instance with vm_state active and task_state migrating.
Feb 16 17:47:34 compute-0 podman[214858]: 2026-02-16 17:47:34.109793083 +0000 UTC m=+0.073023451 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9)
Feb 16 17:47:34 compute-0 nova_compute[186176]: 2026-02-16 17:47:34.770 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:35 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 17:47:35 compute-0 systemd[214745]: Activating special unit Exit the Session...
Feb 16 17:47:35 compute-0 systemd[214745]: Stopped target Main User Target.
Feb 16 17:47:35 compute-0 systemd[214745]: Stopped target Basic System.
Feb 16 17:47:35 compute-0 systemd[214745]: Stopped target Paths.
Feb 16 17:47:35 compute-0 systemd[214745]: Stopped target Sockets.
Feb 16 17:47:35 compute-0 systemd[214745]: Stopped target Timers.
Feb 16 17:47:35 compute-0 systemd[214745]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:47:35 compute-0 systemd[214745]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 17:47:35 compute-0 systemd[214745]: Closed D-Bus User Message Bus Socket.
Feb 16 17:47:35 compute-0 systemd[214745]: Stopped Create User's Volatile Files and Directories.
Feb 16 17:47:35 compute-0 systemd[214745]: Removed slice User Application Slice.
Feb 16 17:47:35 compute-0 systemd[214745]: Reached target Shutdown.
Feb 16 17:47:35 compute-0 systemd[214745]: Finished Exit the Session.
Feb 16 17:47:35 compute-0 systemd[214745]: Reached target Exit the Session.
Feb 16 17:47:35 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 17:47:35 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 17:47:35 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 17:47:35 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 17:47:35 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 17:47:35 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 17:47:35 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.436 186180 DEBUG nova.compute.manager [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.437 186180 DEBUG oslo_concurrency.lockutils [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.437 186180 DEBUG oslo_concurrency.lockutils [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.437 186180 DEBUG oslo_concurrency.lockutils [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.438 186180 DEBUG nova.compute.manager [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] No waiting events found dispatching network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.438 186180 WARNING nova.compute.manager [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received unexpected event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa for instance with vm_state active and task_state migrating.
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.438 186180 DEBUG nova.compute.manager [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.439 186180 DEBUG oslo_concurrency.lockutils [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.439 186180 DEBUG oslo_concurrency.lockutils [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.439 186180 DEBUG oslo_concurrency.lockutils [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.440 186180 DEBUG nova.compute.manager [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] No waiting events found dispatching network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:47:35 compute-0 nova_compute[186176]: 2026-02-16 17:47:35.440 186180 WARNING nova.compute.manager [req-9c962601-7bbc-4651-a8a3-79ce9e087d95 req-b6bb2b99-3da7-4b63-aa07-54566f3400a2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Received unexpected event network-vif-plugged-694c06a2-2c68-418e-91aa-8541c7793afa for instance with vm_state active and task_state migrating.
Feb 16 17:47:37 compute-0 podman[214883]: 2026-02-16 17:47:37.11119145 +0000 UTC m=+0.074459617 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 16 17:47:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:38.180 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:38.181 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:38.181 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:38 compute-0 nova_compute[186176]: 2026-02-16 17:47:38.228 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.288 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.289 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.289 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "6b97d70c-fca4-4b8e-8381-e32928f2a1f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.312 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.313 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.313 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.313 186180 DEBUG nova.compute.resource_tracker [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.501 186180 WARNING nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.503 186180 DEBUG nova.compute.resource_tracker [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5784MB free_disk=73.22297668457031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": 
"0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.503 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.504 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.545 186180 DEBUG nova.compute.resource_tracker [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Migration for instance 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.565 186180 DEBUG nova.compute.resource_tracker [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.592 186180 DEBUG nova.compute.resource_tracker [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Migration 22a7b629-663e-49c2-ba7d-d8b941ae1bff is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.593 186180 DEBUG nova.compute.resource_tracker [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.594 186180 DEBUG nova.compute.resource_tracker [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.700 186180 DEBUG nova.compute.provider_tree [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.717 186180 DEBUG nova.scheduler.client.report [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.739 186180 DEBUG nova.compute.resource_tracker [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.739 186180 DEBUG oslo_concurrency.lockutils [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.743 186180 INFO nova.compute.manager [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.773 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.813 186180 INFO nova.scheduler.client.report [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Deleted allocation for migration 22a7b629-663e-49c2-ba7d-d8b941ae1bff
Feb 16 17:47:39 compute-0 nova_compute[186176]: 2026-02-16 17:47:39.814 186180 DEBUG nova.virt.libvirt.driver [None req-66092f36-7ed6-4c7b-953f-3f7623d3a326 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 17:47:40 compute-0 sshd-session[214903]: Connection closed by 2.57.122.210 port 55062
Feb 16 17:47:41 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:47:41.197 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:47:43 compute-0 podman[214905]: 2026-02-16 17:47:43.108744395 +0000 UTC m=+0.067182418 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 17:47:43 compute-0 podman[214904]: 2026-02-16 17:47:43.156130156 +0000 UTC m=+0.114489057 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:47:43 compute-0 nova_compute[186176]: 2026-02-16 17:47:43.230 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:44 compute-0 nova_compute[186176]: 2026-02-16 17:47:44.797 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:46 compute-0 nova_compute[186176]: 2026-02-16 17:47:46.310 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771264051.3085344, 6b97d70c-fca4-4b8e-8381-e32928f2a1f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:47:46 compute-0 nova_compute[186176]: 2026-02-16 17:47:46.310 186180 INFO nova.compute.manager [-] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] VM Stopped (Lifecycle Event)
Feb 16 17:47:46 compute-0 nova_compute[186176]: 2026-02-16 17:47:46.336 186180 DEBUG nova.compute.manager [None req-dab19dbc-369e-4d76-a5fc-c84f1466383e - - - - - -] [instance: 6b97d70c-fca4-4b8e-8381-e32928f2a1f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:47:48 compute-0 nova_compute[186176]: 2026-02-16 17:47:48.233 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:49 compute-0 nova_compute[186176]: 2026-02-16 17:47:49.801 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:53 compute-0 nova_compute[186176]: 2026-02-16 17:47:53.237 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:53 compute-0 nova_compute[186176]: 2026-02-16 17:47:53.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:47:53 compute-0 nova_compute[186176]: 2026-02-16 17:47:53.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:47:53 compute-0 nova_compute[186176]: 2026-02-16 17:47:53.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:47:53 compute-0 nova_compute[186176]: 2026-02-16 17:47:53.331 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:47:53 compute-0 nova_compute[186176]: 2026-02-16 17:47:53.332 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:47:54 compute-0 nova_compute[186176]: 2026-02-16 17:47:54.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:47:54 compute-0 nova_compute[186176]: 2026-02-16 17:47:54.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:47:54 compute-0 nova_compute[186176]: 2026-02-16 17:47:54.804 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.348 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.350 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.564 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.565 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5797MB free_disk=73.22297668457031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.566 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.566 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.686 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.687 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.712 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.731 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.732 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:47:55 compute-0 nova_compute[186176]: 2026-02-16 17:47:55.732 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:47:56 compute-0 nova_compute[186176]: 2026-02-16 17:47:56.733 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:47:57 compute-0 nova_compute[186176]: 2026-02-16 17:47:57.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:47:58 compute-0 nova_compute[186176]: 2026-02-16 17:47:58.239 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:47:58 compute-0 nova_compute[186176]: 2026-02-16 17:47:58.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:47:59 compute-0 podman[195505]: time="2026-02-16T17:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:47:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:47:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Feb 16 17:47:59 compute-0 nova_compute[186176]: 2026-02-16 17:47:59.811 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:01 compute-0 openstack_network_exporter[198360]: ERROR   17:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:48:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:48:01 compute-0 openstack_network_exporter[198360]: ERROR   17:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:48:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:48:02 compute-0 nova_compute[186176]: 2026-02-16 17:48:02.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:48:03 compute-0 nova_compute[186176]: 2026-02-16 17:48:03.243 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:04 compute-0 nova_compute[186176]: 2026-02-16 17:48:04.829 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:05 compute-0 podman[214952]: 2026-02-16 17:48:05.080543322 +0000 UTC m=+0.052464767 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal)
Feb 16 17:48:08 compute-0 podman[214974]: 2026-02-16 17:48:08.128123981 +0000 UTC m=+0.095888642 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 16 17:48:08 compute-0 nova_compute[186176]: 2026-02-16 17:48:08.246 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:08 compute-0 nova_compute[186176]: 2026-02-16 17:48:08.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:48:09 compute-0 nova_compute[186176]: 2026-02-16 17:48:09.831 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:13 compute-0 nova_compute[186176]: 2026-02-16 17:48:13.287 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:14 compute-0 systemd[1]: Starting dnf makecache...
Feb 16 17:48:14 compute-0 podman[214994]: 2026-02-16 17:48:14.114506554 +0000 UTC m=+0.071428752 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:48:14 compute-0 podman[214993]: 2026-02-16 17:48:14.158715108 +0000 UTC m=+0.127309002 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 17:48:14 compute-0 dnf[215000]: Metadata cache refreshed recently.
Feb 16 17:48:14 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 16 17:48:14 compute-0 systemd[1]: Finished dnf makecache.
Feb 16 17:48:14 compute-0 nova_compute[186176]: 2026-02-16 17:48:14.835 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:15 compute-0 ovn_controller[96437]: 2026-02-16T17:48:15Z|00194|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 16 17:48:18 compute-0 nova_compute[186176]: 2026-02-16 17:48:18.332 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:19 compute-0 nova_compute[186176]: 2026-02-16 17:48:19.836 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:22 compute-0 nova_compute[186176]: 2026-02-16 17:48:22.283 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:23 compute-0 nova_compute[186176]: 2026-02-16 17:48:23.335 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:24 compute-0 nova_compute[186176]: 2026-02-16 17:48:24.837 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:28 compute-0 nova_compute[186176]: 2026-02-16 17:48:28.387 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:29 compute-0 podman[195505]: time="2026-02-16T17:48:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:48:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:48:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:48:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:48:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2174 "" "Go-http-client/1.1"
Feb 16 17:48:29 compute-0 nova_compute[186176]: 2026-02-16 17:48:29.840 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:31 compute-0 openstack_network_exporter[198360]: ERROR   17:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:48:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:48:31 compute-0 openstack_network_exporter[198360]: ERROR   17:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:48:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:48:33 compute-0 nova_compute[186176]: 2026-02-16 17:48:33.391 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:34 compute-0 nova_compute[186176]: 2026-02-16 17:48:34.841 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:34 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:48:34.973 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:48:34 compute-0 nova_compute[186176]: 2026-02-16 17:48:34.974 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:34 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:48:34.975 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:48:35 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:48:35.976 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:48:36 compute-0 podman[215044]: 2026-02-16 17:48:36.076519938 +0000 UTC m=+0.051762255 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, 
org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_id=openstack_network_exporter, version=9.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 17:48:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:48:38.182 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:48:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:48:38.183 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:48:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:48:38.183 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:48:38 compute-0 nova_compute[186176]: 2026-02-16 17:48:38.392 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:39 compute-0 podman[215065]: 2026-02-16 17:48:39.102447172 +0000 UTC m=+0.070048905 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 16 17:48:39 compute-0 nova_compute[186176]: 2026-02-16 17:48:39.841 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:43 compute-0 nova_compute[186176]: 2026-02-16 17:48:43.394 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:44 compute-0 nova_compute[186176]: 2026-02-16 17:48:44.865 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:45 compute-0 podman[215087]: 2026-02-16 17:48:45.102666529 +0000 UTC m=+0.069111741 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:48:45 compute-0 podman[215086]: 2026-02-16 17:48:45.119800301 +0000 UTC m=+0.090757354 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:48:48 compute-0 nova_compute[186176]: 2026-02-16 17:48:48.397 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:49 compute-0 nova_compute[186176]: 2026-02-16 17:48:49.872 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:53 compute-0 nova_compute[186176]: 2026-02-16 17:48:53.311 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:48:53 compute-0 nova_compute[186176]: 2026-02-16 17:48:53.400 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:54 compute-0 nova_compute[186176]: 2026-02-16 17:48:54.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:48:54 compute-0 nova_compute[186176]: 2026-02-16 17:48:54.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:48:54 compute-0 nova_compute[186176]: 2026-02-16 17:48:54.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:48:54 compute-0 nova_compute[186176]: 2026-02-16 17:48:54.333 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:48:54 compute-0 nova_compute[186176]: 2026-02-16 17:48:54.334 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:48:54 compute-0 nova_compute[186176]: 2026-02-16 17:48:54.907 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:55 compute-0 nova_compute[186176]: 2026-02-16 17:48:55.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:48:55 compute-0 nova_compute[186176]: 2026-02-16 17:48:55.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.350 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.350 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.350 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.528 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.529 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5804MB free_disk=73.22294998168945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.530 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.530 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.602 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.602 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.658 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.671 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.672 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:48:56 compute-0 nova_compute[186176]: 2026-02-16 17:48:56.672 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:48:58 compute-0 nova_compute[186176]: 2026-02-16 17:48:58.433 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:48:58 compute-0 nova_compute[186176]: 2026-02-16 17:48:58.667 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:48:59 compute-0 podman[195505]: time="2026-02-16T17:48:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:48:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:48:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:48:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:48:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Feb 16 17:48:59 compute-0 nova_compute[186176]: 2026-02-16 17:48:59.909 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:00 compute-0 nova_compute[186176]: 2026-02-16 17:49:00.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:49:01 compute-0 openstack_network_exporter[198360]: ERROR   17:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:49:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:49:01 compute-0 openstack_network_exporter[198360]: ERROR   17:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:49:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:49:02 compute-0 nova_compute[186176]: 2026-02-16 17:49:02.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:49:03 compute-0 nova_compute[186176]: 2026-02-16 17:49:03.436 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:04 compute-0 nova_compute[186176]: 2026-02-16 17:49:04.911 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:06 compute-0 ovn_controller[96437]: 2026-02-16T17:49:06Z|00195|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 16 17:49:07 compute-0 podman[215137]: 2026-02-16 17:49:07.199431464 +0000 UTC m=+0.073756616 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=openstack_network_exporter, release=1770267347, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, distribution-scope=public, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 16 17:49:08 compute-0 nova_compute[186176]: 2026-02-16 17:49:08.438 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:09 compute-0 nova_compute[186176]: 2026-02-16 17:49:09.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:49:09 compute-0 nova_compute[186176]: 2026-02-16 17:49:09.912 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:10 compute-0 podman[215158]: 2026-02-16 17:49:10.077008378 +0000 UTC m=+0.050660128 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.450 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.450 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.465 186180 DEBUG nova.compute.manager [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.617 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.618 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.624 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.624 186180 INFO nova.compute.claims [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.733 186180 DEBUG nova.compute.provider_tree [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.761 186180 DEBUG nova.scheduler.client.report [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.791 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.792 186180 DEBUG nova.compute.manager [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.909 186180 DEBUG nova.compute.manager [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.909 186180 DEBUG nova.network.neutron [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.934 186180 INFO nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:49:11 compute-0 nova_compute[186176]: 2026-02-16 17:49:11.956 186180 DEBUG nova.compute.manager [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.064 186180 DEBUG nova.compute.manager [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.065 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.066 186180 INFO nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Creating image(s)
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.066 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "/var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.066 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "/var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.067 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "/var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.079 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.126 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.126 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.127 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.137 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.198 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.199 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.221 186180 DEBUG nova.policy [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aace4ef5f521473ca481eaa58a289951', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd0f8251f0a9a482d879e8298c02a9652', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.227 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk 1073741824" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.227 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.228 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.278 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.279 186180 DEBUG nova.virt.disk.api [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Checking if we can resize image /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.280 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.345 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.347 186180 DEBUG nova.virt.disk.api [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Cannot resize image /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.347 186180 DEBUG nova.objects.instance [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lazy-loading 'migration_context' on Instance uuid ad140b46-c259-4541-b51e-ad0fcc4c26d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.364 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.365 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Ensure instance console log exists: /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.366 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.366 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.367 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:49:12 compute-0 nova_compute[186176]: 2026-02-16 17:49:12.976 186180 DEBUG nova.network.neutron [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Successfully created port: 322cec24-ae8f-4853-aca6-9609258d7523 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:49:13 compute-0 nova_compute[186176]: 2026-02-16 17:49:13.441 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:14 compute-0 nova_compute[186176]: 2026-02-16 17:49:14.794 186180 DEBUG nova.network.neutron [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Successfully updated port: 322cec24-ae8f-4853-aca6-9609258d7523 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:49:14 compute-0 nova_compute[186176]: 2026-02-16 17:49:14.816 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "refresh_cache-ad140b46-c259-4541-b51e-ad0fcc4c26d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:49:14 compute-0 nova_compute[186176]: 2026-02-16 17:49:14.817 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquired lock "refresh_cache-ad140b46-c259-4541-b51e-ad0fcc4c26d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:49:14 compute-0 nova_compute[186176]: 2026-02-16 17:49:14.817 186180 DEBUG nova.network.neutron [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:49:14 compute-0 nova_compute[186176]: 2026-02-16 17:49:14.908 186180 DEBUG nova.compute.manager [req-881835c3-1a5d-4c35-9fb6-96fb87e6d3e2 req-9ab4cef3-5da7-4ebc-943e-7457c4faa8b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Received event network-changed-322cec24-ae8f-4853-aca6-9609258d7523 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:49:14 compute-0 nova_compute[186176]: 2026-02-16 17:49:14.909 186180 DEBUG nova.compute.manager [req-881835c3-1a5d-4c35-9fb6-96fb87e6d3e2 req-9ab4cef3-5da7-4ebc-943e-7457c4faa8b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Refreshing instance network info cache due to event network-changed-322cec24-ae8f-4853-aca6-9609258d7523. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:49:14 compute-0 nova_compute[186176]: 2026-02-16 17:49:14.909 186180 DEBUG oslo_concurrency.lockutils [req-881835c3-1a5d-4c35-9fb6-96fb87e6d3e2 req-9ab4cef3-5da7-4ebc-943e-7457c4faa8b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-ad140b46-c259-4541-b51e-ad0fcc4c26d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:49:14 compute-0 nova_compute[186176]: 2026-02-16 17:49:14.913 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:14 compute-0 nova_compute[186176]: 2026-02-16 17:49:14.981 186180 DEBUG nova.network.neutron [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:49:16 compute-0 podman[215196]: 2026-02-16 17:49:16.098922219 +0000 UTC m=+0.071625294 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:49:16 compute-0 podman[215195]: 2026-02-16 17:49:16.149090834 +0000 UTC m=+0.122214129 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.210 186180 DEBUG nova.network.neutron [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Updating instance_info_cache with network_info: [{"id": "322cec24-ae8f-4853-aca6-9609258d7523", "address": "fa:16:3e:88:d5:1d", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap322cec24-ae", "ovs_interfaceid": "322cec24-ae8f-4853-aca6-9609258d7523", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.231 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Releasing lock "refresh_cache-ad140b46-c259-4541-b51e-ad0fcc4c26d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.232 186180 DEBUG nova.compute.manager [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Instance network_info: |[{"id": "322cec24-ae8f-4853-aca6-9609258d7523", "address": "fa:16:3e:88:d5:1d", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap322cec24-ae", "ovs_interfaceid": "322cec24-ae8f-4853-aca6-9609258d7523", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.233 186180 DEBUG oslo_concurrency.lockutils [req-881835c3-1a5d-4c35-9fb6-96fb87e6d3e2 req-9ab4cef3-5da7-4ebc-943e-7457c4faa8b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-ad140b46-c259-4541-b51e-ad0fcc4c26d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.233 186180 DEBUG nova.network.neutron [req-881835c3-1a5d-4c35-9fb6-96fb87e6d3e2 req-9ab4cef3-5da7-4ebc-943e-7457c4faa8b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Refreshing network info cache for port 322cec24-ae8f-4853-aca6-9609258d7523 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.239 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Start _get_guest_xml network_info=[{"id": "322cec24-ae8f-4853-aca6-9609258d7523", "address": "fa:16:3e:88:d5:1d", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap322cec24-ae", "ovs_interfaceid": "322cec24-ae8f-4853-aca6-9609258d7523", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.245 186180 WARNING nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.253 186180 DEBUG nova.virt.libvirt.host [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.254 186180 DEBUG nova.virt.libvirt.host [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.265 186180 DEBUG nova.virt.libvirt.host [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.266 186180 DEBUG nova.virt.libvirt.host [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.268 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.268 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.269 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.270 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.270 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.270 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.271 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.271 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.272 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.272 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.272 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.273 186180 DEBUG nova.virt.hardware [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.280 186180 DEBUG nova.virt.libvirt.vif [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:49:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-666654609',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-666654609',id=25,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0f8251f0a9a482d879e8298c02a9652',ramdisk_id='',reservation_id='r-w0i03c7g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-485487738',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-485487738-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:49:11Z,user_data=None,user_id='aace4ef5f521473ca481eaa58a289951',uuid=ad140b46-c259-4541-b51e-ad0fcc4c26d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "322cec24-ae8f-4853-aca6-9609258d7523", "address": "fa:16:3e:88:d5:1d", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap322cec24-ae", "ovs_interfaceid": "322cec24-ae8f-4853-aca6-9609258d7523", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.280 186180 DEBUG nova.network.os_vif_util [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Converting VIF {"id": "322cec24-ae8f-4853-aca6-9609258d7523", "address": "fa:16:3e:88:d5:1d", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap322cec24-ae", "ovs_interfaceid": "322cec24-ae8f-4853-aca6-9609258d7523", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.281 186180 DEBUG nova.network.os_vif_util [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:d5:1d,bridge_name='br-int',has_traffic_filtering=True,id=322cec24-ae8f-4853-aca6-9609258d7523,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap322cec24-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.283 186180 DEBUG nova.objects.instance [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lazy-loading 'pci_devices' on Instance uuid ad140b46-c259-4541-b51e-ad0fcc4c26d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.305 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:49:17 compute-0 nova_compute[186176]:   <uuid>ad140b46-c259-4541-b51e-ad0fcc4c26d4</uuid>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   <name>instance-00000019</name>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-666654609</nova:name>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:49:17</nova:creationTime>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:49:17 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:49:17 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:49:17 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:49:17 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:49:17 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:49:17 compute-0 nova_compute[186176]:         <nova:user uuid="aace4ef5f521473ca481eaa58a289951">tempest-TestExecuteVmWorkloadBalanceStrategy-485487738-project-member</nova:user>
Feb 16 17:49:17 compute-0 nova_compute[186176]:         <nova:project uuid="d0f8251f0a9a482d879e8298c02a9652">tempest-TestExecuteVmWorkloadBalanceStrategy-485487738</nova:project>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:49:17 compute-0 nova_compute[186176]:         <nova:port uuid="322cec24-ae8f-4853-aca6-9609258d7523">
Feb 16 17:49:17 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <system>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <entry name="serial">ad140b46-c259-4541-b51e-ad0fcc4c26d4</entry>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <entry name="uuid">ad140b46-c259-4541-b51e-ad0fcc4c26d4</entry>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     </system>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   <os>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   </os>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   <features>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   </features>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk.config"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:88:d5:1d"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <target dev="tap322cec24-ae"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/console.log" append="off"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <video>
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     </video>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:49:17 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:49:17 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:49:17 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:49:17 compute-0 nova_compute[186176]: </domain>
Feb 16 17:49:17 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.307 186180 DEBUG nova.compute.manager [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Preparing to wait for external event network-vif-plugged-322cec24-ae8f-4853-aca6-9609258d7523 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.307 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.307 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.308 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.309 186180 DEBUG nova.virt.libvirt.vif [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:49:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-666654609',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-666654609',id=25,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0f8251f0a9a482d879e8298c02a9652',ramdisk_id='',reservation_id='r-w0i03c7g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-485487738',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-485487738-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:49:11Z,user_data=None,user_id='aace4ef5f521473ca481eaa58a289951',uuid=ad140b46-c259-4541-b51e-ad0fcc4c26d4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "322cec24-ae8f-4853-aca6-9609258d7523", "address": "fa:16:3e:88:d5:1d", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap322cec24-ae", "ovs_interfaceid": "322cec24-ae8f-4853-aca6-9609258d7523", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.309 186180 DEBUG nova.network.os_vif_util [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Converting VIF {"id": "322cec24-ae8f-4853-aca6-9609258d7523", "address": "fa:16:3e:88:d5:1d", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap322cec24-ae", "ovs_interfaceid": "322cec24-ae8f-4853-aca6-9609258d7523", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.310 186180 DEBUG nova.network.os_vif_util [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:88:d5:1d,bridge_name='br-int',has_traffic_filtering=True,id=322cec24-ae8f-4853-aca6-9609258d7523,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap322cec24-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.311 186180 DEBUG os_vif [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:d5:1d,bridge_name='br-int',has_traffic_filtering=True,id=322cec24-ae8f-4853-aca6-9609258d7523,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap322cec24-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.312 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.312 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.313 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.317 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.317 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap322cec24-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.318 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap322cec24-ae, col_values=(('external_ids', {'iface-id': '322cec24-ae8f-4853-aca6-9609258d7523', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:88:d5:1d', 'vm-uuid': 'ad140b46-c259-4541-b51e-ad0fcc4c26d4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.320 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:17 compute-0 NetworkManager[56463]: <info>  [1771264157.3220] manager: (tap322cec24-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.322 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.327 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.329 186180 INFO os_vif [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:88:d5:1d,bridge_name='br-int',has_traffic_filtering=True,id=322cec24-ae8f-4853-aca6-9609258d7523,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap322cec24-ae')
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.392 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.393 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.394 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] No VIF found with MAC fa:16:3e:88:d5:1d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:49:17 compute-0 nova_compute[186176]: 2026-02-16 17:49:17.395 186180 INFO nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Using config drive
Feb 16 17:49:18 compute-0 nova_compute[186176]: 2026-02-16 17:49:18.273 186180 INFO nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Creating config drive at /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk.config
Feb 16 17:49:18 compute-0 nova_compute[186176]: 2026-02-16 17:49:18.276 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpy61ne6v_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:49:18 compute-0 nova_compute[186176]: 2026-02-16 17:49:18.398 186180 DEBUG oslo_concurrency.processutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpy61ne6v_" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:49:18 compute-0 kernel: tap322cec24-ae: entered promiscuous mode
Feb 16 17:49:18 compute-0 NetworkManager[56463]: <info>  [1771264158.4638] manager: (tap322cec24-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Feb 16 17:49:18 compute-0 nova_compute[186176]: 2026-02-16 17:49:18.500 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:18 compute-0 ovn_controller[96437]: 2026-02-16T17:49:18Z|00196|binding|INFO|Claiming lport 322cec24-ae8f-4853-aca6-9609258d7523 for this chassis.
Feb 16 17:49:18 compute-0 ovn_controller[96437]: 2026-02-16T17:49:18Z|00197|binding|INFO|322cec24-ae8f-4853-aca6-9609258d7523: Claiming fa:16:3e:88:d5:1d 10.100.0.14
Feb 16 17:49:18 compute-0 systemd-udevd[215264]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.527 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:d5:1d 10.100.0.14'], port_security=['fa:16:3e:88:d5:1d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ad140b46-c259-4541-b51e-ad0fcc4c26d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0f8251f0a9a482d879e8298c02a9652', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f95a2b3-a7bc-49f8-9945-a529603420cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=092b2aac-8232-424c-92e7-57054cbc7fed, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=322cec24-ae8f-4853-aca6-9609258d7523) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.528 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 322cec24-ae8f-4853-aca6-9609258d7523 in datapath 9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e bound to our chassis
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.529 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e
Feb 16 17:49:18 compute-0 systemd-machined[155631]: New machine qemu-19-instance-00000019.
Feb 16 17:49:18 compute-0 ovn_controller[96437]: 2026-02-16T17:49:18Z|00198|binding|INFO|Setting lport 322cec24-ae8f-4853-aca6-9609258d7523 ovn-installed in OVS
Feb 16 17:49:18 compute-0 ovn_controller[96437]: 2026-02-16T17:49:18Z|00199|binding|INFO|Setting lport 322cec24-ae8f-4853-aca6-9609258d7523 up in Southbound
Feb 16 17:49:18 compute-0 nova_compute[186176]: 2026-02-16 17:49:18.542 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.542 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[5e97e7ce-d3bd-46c6-86f3-14716492ed63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.543 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9fcb1bf9-d1 in ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:49:18 compute-0 NetworkManager[56463]: <info>  [1771264158.5465] device (tap322cec24-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:49:18 compute-0 NetworkManager[56463]: <info>  [1771264158.5482] device (tap322cec24-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.546 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9fcb1bf9-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.546 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[fc9d9cd3-778c-452f-9864-0487fd5c43b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.547 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[20437440-1b22-4fa7-9ce8-01a3b58e1439]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000019.
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.562 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[1716ecb2-f798-4d95-ab0f-d461fa8603b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.590 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f164e8f0-8204-430d-9121-4281d594afd0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.615 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[8e31c8b1-94d7-405d-8611-e75358308c6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.620 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[43d45350-c1c0-44b4-af56-6d0dcd548299]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 NetworkManager[56463]: <info>  [1771264158.6222] manager: (tap9fcb1bf9-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Feb 16 17:49:18 compute-0 systemd-udevd[215268]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.650 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[1d517ba1-579c-4a51-97de-f867280f1acb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.654 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[5006c72d-7c80-4478-9c6b-d93498f31abe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 NetworkManager[56463]: <info>  [1771264158.6711] device (tap9fcb1bf9-d0): carrier: link connected
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.680 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[ebbdebd9-19b9-4a59-aaa7-b9538689b8ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.693 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[5221d06c-f42c-436d-895c-57a81a1a2151]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9fcb1bf9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:7d:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575419, 'reachable_time': 42651, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215298, 'error': None, 'target': 'ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.708 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[118bb222-17bf-49a2-affa-f62d00bce9df]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feca:7d8e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 575419, 'tstamp': 575419}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215299, 'error': None, 'target': 'ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.724 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[c1168762-8d96-4b3d-8d85-b57cfd7bbbbd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9fcb1bf9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:7d:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575419, 'reachable_time': 42651, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215300, 'error': None, 'target': 'ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.750 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[740fb14b-9ac7-4b5a-8562-d6b911d0707f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.803 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[7a77f5a4-9281-48bc-8fa5-8e1d2cb4f7c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.808 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9fcb1bf9-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.809 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.810 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9fcb1bf9-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:49:18 compute-0 kernel: tap9fcb1bf9-d0: entered promiscuous mode
Feb 16 17:49:18 compute-0 NetworkManager[56463]: <info>  [1771264158.8131] manager: (tap9fcb1bf9-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Feb 16 17:49:18 compute-0 nova_compute[186176]: 2026-02-16 17:49:18.812 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.820 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9fcb1bf9-d0, col_values=(('external_ids', {'iface-id': '613be906-2a79-4164-80b2-078ce66608ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:49:18 compute-0 ovn_controller[96437]: 2026-02-16T17:49:18Z|00200|binding|INFO|Releasing lport 613be906-2a79-4164-80b2-078ce66608ba from this chassis (sb_readonly=0)
Feb 16 17:49:18 compute-0 nova_compute[186176]: 2026-02-16 17:49:18.822 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.827 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:49:18 compute-0 nova_compute[186176]: 2026-02-16 17:49:18.831 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.829 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f6fc24f7-47b5-4896-9161-c2f703b8b4a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.832 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e.pid.haproxy
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:49:18 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:18.833 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'env', 'PROCESS_TAG=haproxy-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:49:18 compute-0 nova_compute[186176]: 2026-02-16 17:49:18.995 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264158.9945667, ad140b46-c259-4541-b51e-ad0fcc4c26d4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:49:18 compute-0 nova_compute[186176]: 2026-02-16 17:49:18.996 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] VM Started (Lifecycle Event)
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.021 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.026 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264158.9949038, ad140b46-c259-4541-b51e-ad0fcc4c26d4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.027 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] VM Paused (Lifecycle Event)
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.049 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.053 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.077 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:49:19 compute-0 podman[215339]: 2026-02-16 17:49:19.180758419 +0000 UTC m=+0.050437871 container create 292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:49:19 compute-0 systemd[1]: Started libpod-conmon-292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b.scope.
Feb 16 17:49:19 compute-0 podman[215339]: 2026-02-16 17:49:19.149623154 +0000 UTC m=+0.019302596 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:49:19 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:49:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90ee2e15ed80bec4af51bf43ec9fa4866b1c204cbd204f0df5ad6b7450039af9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.256 186180 DEBUG nova.compute.manager [req-a5627e6b-4c73-4bfe-91f9-a74bc64bc960 req-31c85951-1914-4a78-9f54-0dce1f75fd53 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Received event network-vif-plugged-322cec24-ae8f-4853-aca6-9609258d7523 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.257 186180 DEBUG oslo_concurrency.lockutils [req-a5627e6b-4c73-4bfe-91f9-a74bc64bc960 req-31c85951-1914-4a78-9f54-0dce1f75fd53 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.257 186180 DEBUG oslo_concurrency.lockutils [req-a5627e6b-4c73-4bfe-91f9-a74bc64bc960 req-31c85951-1914-4a78-9f54-0dce1f75fd53 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.257 186180 DEBUG oslo_concurrency.lockutils [req-a5627e6b-4c73-4bfe-91f9-a74bc64bc960 req-31c85951-1914-4a78-9f54-0dce1f75fd53 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.258 186180 DEBUG nova.compute.manager [req-a5627e6b-4c73-4bfe-91f9-a74bc64bc960 req-31c85951-1914-4a78-9f54-0dce1f75fd53 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Processing event network-vif-plugged-322cec24-ae8f-4853-aca6-9609258d7523 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.259 186180 DEBUG nova.compute.manager [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.263 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264159.2636557, ad140b46-c259-4541-b51e-ad0fcc4c26d4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.264 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] VM Resumed (Lifecycle Event)
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.266 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:49:19 compute-0 podman[215339]: 2026-02-16 17:49:19.272301012 +0000 UTC m=+0.141980524 container init 292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.271 186180 INFO nova.virt.libvirt.driver [-] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Instance spawned successfully.
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.273 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:49:19 compute-0 podman[215339]: 2026-02-16 17:49:19.28158291 +0000 UTC m=+0.151262342 container start 292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.293 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:49:19 compute-0 neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e[215354]: [NOTICE]   (215358) : New worker (215360) forked
Feb 16 17:49:19 compute-0 neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e[215354]: [NOTICE]   (215358) : Loading success.
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.304 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.311 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.312 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.313 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.314 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.315 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.316 186180 DEBUG nova.virt.libvirt.driver [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.330 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.419 186180 INFO nova.compute.manager [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Took 7.35 seconds to spawn the instance on the hypervisor.
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.420 186180 DEBUG nova.compute.manager [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.516 186180 INFO nova.compute.manager [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Took 7.94 seconds to build instance.
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.543 186180 DEBUG oslo_concurrency.lockutils [None req-16111558-1403-47e6-8125-7c1d3f249d5c aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:49:19 compute-0 nova_compute[186176]: 2026-02-16 17:49:19.919 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:20 compute-0 nova_compute[186176]: 2026-02-16 17:49:20.029 186180 DEBUG nova.network.neutron [req-881835c3-1a5d-4c35-9fb6-96fb87e6d3e2 req-9ab4cef3-5da7-4ebc-943e-7457c4faa8b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Updated VIF entry in instance network info cache for port 322cec24-ae8f-4853-aca6-9609258d7523. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:49:20 compute-0 nova_compute[186176]: 2026-02-16 17:49:20.030 186180 DEBUG nova.network.neutron [req-881835c3-1a5d-4c35-9fb6-96fb87e6d3e2 req-9ab4cef3-5da7-4ebc-943e-7457c4faa8b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Updating instance_info_cache with network_info: [{"id": "322cec24-ae8f-4853-aca6-9609258d7523", "address": "fa:16:3e:88:d5:1d", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap322cec24-ae", "ovs_interfaceid": "322cec24-ae8f-4853-aca6-9609258d7523", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:49:20 compute-0 nova_compute[186176]: 2026-02-16 17:49:20.060 186180 DEBUG oslo_concurrency.lockutils [req-881835c3-1a5d-4c35-9fb6-96fb87e6d3e2 req-9ab4cef3-5da7-4ebc-943e-7457c4faa8b2 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-ad140b46-c259-4541-b51e-ad0fcc4c26d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:49:21 compute-0 nova_compute[186176]: 2026-02-16 17:49:21.578 186180 DEBUG nova.compute.manager [req-7bbb6ffd-8376-4d84-bd35-bbf54881d427 req-4ab63a85-8a5a-494c-9270-3bd96d48b9c7 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Received event network-vif-plugged-322cec24-ae8f-4853-aca6-9609258d7523 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:49:21 compute-0 nova_compute[186176]: 2026-02-16 17:49:21.579 186180 DEBUG oslo_concurrency.lockutils [req-7bbb6ffd-8376-4d84-bd35-bbf54881d427 req-4ab63a85-8a5a-494c-9270-3bd96d48b9c7 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:49:21 compute-0 nova_compute[186176]: 2026-02-16 17:49:21.579 186180 DEBUG oslo_concurrency.lockutils [req-7bbb6ffd-8376-4d84-bd35-bbf54881d427 req-4ab63a85-8a5a-494c-9270-3bd96d48b9c7 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:49:21 compute-0 nova_compute[186176]: 2026-02-16 17:49:21.580 186180 DEBUG oslo_concurrency.lockutils [req-7bbb6ffd-8376-4d84-bd35-bbf54881d427 req-4ab63a85-8a5a-494c-9270-3bd96d48b9c7 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:49:21 compute-0 nova_compute[186176]: 2026-02-16 17:49:21.580 186180 DEBUG nova.compute.manager [req-7bbb6ffd-8376-4d84-bd35-bbf54881d427 req-4ab63a85-8a5a-494c-9270-3bd96d48b9c7 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] No waiting events found dispatching network-vif-plugged-322cec24-ae8f-4853-aca6-9609258d7523 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:49:21 compute-0 nova_compute[186176]: 2026-02-16 17:49:21.580 186180 WARNING nova.compute.manager [req-7bbb6ffd-8376-4d84-bd35-bbf54881d427 req-4ab63a85-8a5a-494c-9270-3bd96d48b9c7 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Received unexpected event network-vif-plugged-322cec24-ae8f-4853-aca6-9609258d7523 for instance with vm_state active and task_state None.
Feb 16 17:49:22 compute-0 nova_compute[186176]: 2026-02-16 17:49:22.322 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:24 compute-0 nova_compute[186176]: 2026-02-16 17:49:24.921 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:27 compute-0 nova_compute[186176]: 2026-02-16 17:49:27.326 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:29 compute-0 podman[195505]: time="2026-02-16T17:49:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:49:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:49:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:49:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:49:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2641 "" "Go-http-client/1.1"
Feb 16 17:49:29 compute-0 nova_compute[186176]: 2026-02-16 17:49:29.925 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:31 compute-0 openstack_network_exporter[198360]: ERROR   17:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:49:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:49:31 compute-0 openstack_network_exporter[198360]: ERROR   17:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:49:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:49:31 compute-0 ovn_controller[96437]: 2026-02-16T17:49:31Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:88:d5:1d 10.100.0.14
Feb 16 17:49:31 compute-0 ovn_controller[96437]: 2026-02-16T17:49:31Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:88:d5:1d 10.100.0.14
Feb 16 17:49:32 compute-0 nova_compute[186176]: 2026-02-16 17:49:32.329 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:34 compute-0 nova_compute[186176]: 2026-02-16 17:49:34.925 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:37 compute-0 nova_compute[186176]: 2026-02-16 17:49:37.332 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:38 compute-0 podman[215381]: 2026-02-16 17:49:38.103182578 +0000 UTC m=+0.075247193 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, name=ubi9/ubi-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 16 17:49:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:38.183 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:49:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:38.184 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:49:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:38.184 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:49:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:39.187 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:49:39 compute-0 nova_compute[186176]: 2026-02-16 17:49:39.188 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:39.189 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:49:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:49:39.191 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:49:39 compute-0 nova_compute[186176]: 2026-02-16 17:49:39.928 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:41 compute-0 podman[215403]: 2026-02-16 17:49:41.101175493 +0000 UTC m=+0.065285958 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 17:49:42 compute-0 nova_compute[186176]: 2026-02-16 17:49:42.336 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:44 compute-0 nova_compute[186176]: 2026-02-16 17:49:44.929 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:47 compute-0 podman[215424]: 2026-02-16 17:49:47.121327641 +0000 UTC m=+0.074619347 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:49:47 compute-0 podman[215423]: 2026-02-16 17:49:47.147723551 +0000 UTC m=+0.107922067 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 16 17:49:47 compute-0 nova_compute[186176]: 2026-02-16 17:49:47.385 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:49 compute-0 nova_compute[186176]: 2026-02-16 17:49:49.932 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:52 compute-0 nova_compute[186176]: 2026-02-16 17:49:52.389 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:54 compute-0 nova_compute[186176]: 2026-02-16 17:49:54.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:49:54 compute-0 nova_compute[186176]: 2026-02-16 17:49:54.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:49:54 compute-0 nova_compute[186176]: 2026-02-16 17:49:54.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:49:54 compute-0 nova_compute[186176]: 2026-02-16 17:49:54.935 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:55 compute-0 nova_compute[186176]: 2026-02-16 17:49:55.035 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-ad140b46-c259-4541-b51e-ad0fcc4c26d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:49:55 compute-0 nova_compute[186176]: 2026-02-16 17:49:55.036 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-ad140b46-c259-4541-b51e-ad0fcc4c26d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:49:55 compute-0 nova_compute[186176]: 2026-02-16 17:49:55.036 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:49:55 compute-0 nova_compute[186176]: 2026-02-16 17:49:55.037 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid ad140b46-c259-4541-b51e-ad0fcc4c26d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:49:57 compute-0 nova_compute[186176]: 2026-02-16 17:49:57.055 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Updating instance_info_cache with network_info: [{"id": "322cec24-ae8f-4853-aca6-9609258d7523", "address": "fa:16:3e:88:d5:1d", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap322cec24-ae", "ovs_interfaceid": "322cec24-ae8f-4853-aca6-9609258d7523", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:49:57 compute-0 nova_compute[186176]: 2026-02-16 17:49:57.074 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-ad140b46-c259-4541-b51e-ad0fcc4c26d4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:49:57 compute-0 nova_compute[186176]: 2026-02-16 17:49:57.075 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:49:57 compute-0 nova_compute[186176]: 2026-02-16 17:49:57.076 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:49:57 compute-0 nova_compute[186176]: 2026-02-16 17:49:57.077 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:49:57 compute-0 nova_compute[186176]: 2026-02-16 17:49:57.077 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:49:57 compute-0 nova_compute[186176]: 2026-02-16 17:49:57.077 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:49:57 compute-0 nova_compute[186176]: 2026-02-16 17:49:57.426 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.348 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.349 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.349 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.427 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.507 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.508 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.573 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.754 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.755 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5642MB free_disk=73.19388580322266GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.756 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.756 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.959 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance ad140b46-c259-4541-b51e-ad0fcc4c26d4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.959 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.959 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:49:58 compute-0 nova_compute[186176]: 2026-02-16 17:49:58.979 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing inventories for resource provider bb904aac-529f-46ef-9861-9c655a4b383c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 17:49:59 compute-0 nova_compute[186176]: 2026-02-16 17:49:59.001 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating ProviderTree inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 17:49:59 compute-0 nova_compute[186176]: 2026-02-16 17:49:59.001 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:49:59 compute-0 nova_compute[186176]: 2026-02-16 17:49:59.029 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing aggregate associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 17:49:59 compute-0 nova_compute[186176]: 2026-02-16 17:49:59.069 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing trait associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 17:49:59 compute-0 nova_compute[186176]: 2026-02-16 17:49:59.113 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:49:59 compute-0 nova_compute[186176]: 2026-02-16 17:49:59.134 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:49:59 compute-0 nova_compute[186176]: 2026-02-16 17:49:59.163 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:49:59 compute-0 nova_compute[186176]: 2026-02-16 17:49:59.164 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:49:59 compute-0 podman[195505]: time="2026-02-16T17:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:49:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:49:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2645 "" "Go-http-client/1.1"
Feb 16 17:49:59 compute-0 ovn_controller[96437]: 2026-02-16T17:49:59Z|00201|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Feb 16 17:49:59 compute-0 nova_compute[186176]: 2026-02-16 17:49:59.938 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:01 compute-0 openstack_network_exporter[198360]: ERROR   17:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:50:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:50:01 compute-0 openstack_network_exporter[198360]: ERROR   17:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:50:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:50:02 compute-0 nova_compute[186176]: 2026-02-16 17:50:02.164 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:50:02 compute-0 nova_compute[186176]: 2026-02-16 17:50:02.429 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:03 compute-0 nova_compute[186176]: 2026-02-16 17:50:03.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:50:04 compute-0 nova_compute[186176]: 2026-02-16 17:50:04.941 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:07 compute-0 nova_compute[186176]: 2026-02-16 17:50:07.481 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:09 compute-0 podman[215479]: 2026-02-16 17:50:09.108999066 +0000 UTC m=+0.076426962 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, release=1770267347, version=9.7, config_id=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Feb 16 17:50:09 compute-0 nova_compute[186176]: 2026-02-16 17:50:09.944 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:10 compute-0 nova_compute[186176]: 2026-02-16 17:50:10.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:50:12 compute-0 podman[215500]: 2026-02-16 17:50:12.107172187 +0000 UTC m=+0.072006833 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 16 17:50:12 compute-0 nova_compute[186176]: 2026-02-16 17:50:12.486 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:14 compute-0 nova_compute[186176]: 2026-02-16 17:50:14.947 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:17 compute-0 nova_compute[186176]: 2026-02-16 17:50:17.490 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:18 compute-0 podman[215521]: 2026-02-16 17:50:18.125573304 +0000 UTC m=+0.093696317 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:50:18 compute-0 podman[215520]: 2026-02-16 17:50:18.131659334 +0000 UTC m=+0.099488769 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 16 17:50:20 compute-0 nova_compute[186176]: 2026-02-16 17:50:20.001 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:22 compute-0 nova_compute[186176]: 2026-02-16 17:50:22.527 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:25 compute-0 nova_compute[186176]: 2026-02-16 17:50:25.002 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:25 compute-0 nova_compute[186176]: 2026-02-16 17:50:25.903 186180 DEBUG nova.virt.libvirt.driver [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Creating tmpfile /var/lib/nova/instances/tmpn30ckiub to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 16 17:50:25 compute-0 nova_compute[186176]: 2026-02-16 17:50:25.906 186180 DEBUG nova.compute.manager [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn30ckiub',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 16 17:50:27 compute-0 nova_compute[186176]: 2026-02-16 17:50:27.531 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:28 compute-0 nova_compute[186176]: 2026-02-16 17:50:28.277 186180 DEBUG nova.compute.manager [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn30ckiub',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='96b40564-34c8-4e82-b296-76a49bc59876',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 16 17:50:28 compute-0 nova_compute[186176]: 2026-02-16 17:50:28.301 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-96b40564-34c8-4e82-b296-76a49bc59876" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:50:28 compute-0 nova_compute[186176]: 2026-02-16 17:50:28.302 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-96b40564-34c8-4e82-b296-76a49bc59876" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:50:28 compute-0 nova_compute[186176]: 2026-02-16 17:50:28.302 186180 DEBUG nova.network.neutron [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:50:29 compute-0 podman[195505]: time="2026-02-16T17:50:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:50:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:50:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:50:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:50:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2643 "" "Go-http-client/1.1"
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.005 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.430 186180 DEBUG nova.network.neutron [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Updating instance_info_cache with network_info: [{"id": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "address": "fa:16:3e:6a:ce:18", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2e5b5a7-0e", "ovs_interfaceid": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.452 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-96b40564-34c8-4e82-b296-76a49bc59876" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.454 186180 DEBUG nova.virt.libvirt.driver [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn30ckiub',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='96b40564-34c8-4e82-b296-76a49bc59876',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.455 186180 DEBUG nova.virt.libvirt.driver [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Creating instance directory: /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.456 186180 DEBUG nova.virt.libvirt.driver [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Creating disk.info with the contents: {'/var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk': 'qcow2', '/var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.457 186180 DEBUG nova.virt.libvirt.driver [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.458 186180 DEBUG nova.objects.instance [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 96b40564-34c8-4e82-b296-76a49bc59876 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.512 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.562 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.563 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.564 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.587 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.636 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.638 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.670 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.672 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.672 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.728 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.730 186180 DEBUG nova.virt.disk.api [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Checking if we can resize image /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.731 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.813 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.815 186180 DEBUG nova.virt.disk.api [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Cannot resize image /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.816 186180 DEBUG nova.objects.instance [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid 96b40564-34c8-4e82-b296-76a49bc59876 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.831 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.863 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk.config 485376" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.866 186180 DEBUG nova.virt.libvirt.volume.remotefs [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Copying file compute-1.ctlplane.example.com:/var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk.config to /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103
Feb 16 17:50:30 compute-0 nova_compute[186176]: 2026-02-16 17:50:30.866 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk.config /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.349 186180 DEBUG oslo_concurrency.processutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "scp -C -r compute-1.ctlplane.example.com:/var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876/disk.config /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.350 186180 DEBUG nova.virt.libvirt.driver [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.351 186180 DEBUG nova.virt.libvirt.vif [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:49:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1596844720',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1596844720',id=26,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:49:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d0f8251f0a9a482d879e8298c02a9652',ramdisk_id='',reservation_id='r-45094i9g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-485487738',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-485487738-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:49:39Z,user_data=None,user_id='aace4ef5f521473ca481eaa58a289951',uuid=96b40564-34c8-4e82-b296-76a49bc59876,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "address": "fa:16:3e:6a:ce:18", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf2e5b5a7-0e", "ovs_interfaceid": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.351 186180 DEBUG nova.network.os_vif_util [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "address": "fa:16:3e:6a:ce:18", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf2e5b5a7-0e", "ovs_interfaceid": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.352 186180 DEBUG nova.network.os_vif_util [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:ce:18,bridge_name='br-int',has_traffic_filtering=True,id=f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2e5b5a7-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.352 186180 DEBUG os_vif [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:ce:18,bridge_name='br-int',has_traffic_filtering=True,id=f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2e5b5a7-0e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.353 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.353 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.354 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.357 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.358 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2e5b5a7-0e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.358 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf2e5b5a7-0e, col_values=(('external_ids', {'iface-id': 'f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:ce:18', 'vm-uuid': '96b40564-34c8-4e82-b296-76a49bc59876'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.407 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:31 compute-0 NetworkManager[56463]: <info>  [1771264231.4090] manager: (tapf2e5b5a7-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.410 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.417 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.418 186180 INFO os_vif [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:ce:18,bridge_name='br-int',has_traffic_filtering=True,id=f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2e5b5a7-0e')
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.418 186180 DEBUG nova.virt.libvirt.driver [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 16 17:50:31 compute-0 openstack_network_exporter[198360]: ERROR   17:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:50:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:50:31 compute-0 nova_compute[186176]: 2026-02-16 17:50:31.418 186180 DEBUG nova.compute.manager [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn30ckiub',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='96b40564-34c8-4e82-b296-76a49bc59876',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 16 17:50:31 compute-0 openstack_network_exporter[198360]: ERROR   17:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:50:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:50:32 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:32.453 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:50:32 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:32.455 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:50:32 compute-0 nova_compute[186176]: 2026-02-16 17:50:32.493 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:33 compute-0 nova_compute[186176]: 2026-02-16 17:50:33.273 186180 DEBUG nova.network.neutron [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Port f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 16 17:50:33 compute-0 nova_compute[186176]: 2026-02-16 17:50:33.275 186180 DEBUG nova.compute.manager [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpn30ckiub',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='96b40564-34c8-4e82-b296-76a49bc59876',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 16 17:50:33 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 16 17:50:33 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 16 17:50:33 compute-0 kernel: tapf2e5b5a7-0e: entered promiscuous mode
Feb 16 17:50:33 compute-0 NetworkManager[56463]: <info>  [1771264233.6233] manager: (tapf2e5b5a7-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Feb 16 17:50:33 compute-0 systemd-udevd[215637]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:50:33 compute-0 ovn_controller[96437]: 2026-02-16T17:50:33Z|00202|binding|INFO|Claiming lport f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 for this additional chassis.
Feb 16 17:50:33 compute-0 ovn_controller[96437]: 2026-02-16T17:50:33Z|00203|binding|INFO|f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14: Claiming fa:16:3e:6a:ce:18 10.100.0.9
Feb 16 17:50:33 compute-0 nova_compute[186176]: 2026-02-16 17:50:33.667 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:33 compute-0 ovn_controller[96437]: 2026-02-16T17:50:33Z|00204|binding|INFO|Setting lport f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 ovn-installed in OVS
Feb 16 17:50:33 compute-0 nova_compute[186176]: 2026-02-16 17:50:33.677 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:33 compute-0 NetworkManager[56463]: <info>  [1771264233.6839] device (tapf2e5b5a7-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:50:33 compute-0 NetworkManager[56463]: <info>  [1771264233.6869] device (tapf2e5b5a7-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:50:33 compute-0 systemd-machined[155631]: New machine qemu-20-instance-0000001a.
Feb 16 17:50:33 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-0000001a.
Feb 16 17:50:35 compute-0 nova_compute[186176]: 2026-02-16 17:50:35.006 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:35 compute-0 nova_compute[186176]: 2026-02-16 17:50:35.122 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264235.1220086, 96b40564-34c8-4e82-b296-76a49bc59876 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:50:35 compute-0 nova_compute[186176]: 2026-02-16 17:50:35.122 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] VM Started (Lifecycle Event)
Feb 16 17:50:35 compute-0 nova_compute[186176]: 2026-02-16 17:50:35.141 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:50:35 compute-0 nova_compute[186176]: 2026-02-16 17:50:35.821 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264235.8210292, 96b40564-34c8-4e82-b296-76a49bc59876 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:50:35 compute-0 nova_compute[186176]: 2026-02-16 17:50:35.823 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] VM Resumed (Lifecycle Event)
Feb 16 17:50:35 compute-0 nova_compute[186176]: 2026-02-16 17:50:35.852 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:50:35 compute-0 nova_compute[186176]: 2026-02-16 17:50:35.857 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:50:35 compute-0 nova_compute[186176]: 2026-02-16 17:50:35.878 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-0.ctlplane.example.com
Feb 16 17:50:36 compute-0 nova_compute[186176]: 2026-02-16 17:50:36.409 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:36 compute-0 ovn_controller[96437]: 2026-02-16T17:50:36Z|00205|binding|INFO|Claiming lport f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 for this chassis.
Feb 16 17:50:36 compute-0 ovn_controller[96437]: 2026-02-16T17:50:36Z|00206|binding|INFO|f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14: Claiming fa:16:3e:6a:ce:18 10.100.0.9
Feb 16 17:50:36 compute-0 ovn_controller[96437]: 2026-02-16T17:50:36Z|00207|binding|INFO|Setting lport f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 up in Southbound
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.752 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:ce:18 10.100.0.9'], port_security=['fa:16:3e:6a:ce:18 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '96b40564-34c8-4e82-b296-76a49bc59876', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0f8251f0a9a482d879e8298c02a9652', 'neutron:revision_number': '11', 'neutron:security_group_ids': '5f95a2b3-a7bc-49f8-9945-a529603420cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=092b2aac-8232-424c-92e7-57054cbc7fed, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.754 105730 INFO neutron.agent.ovn.metadata.agent [-] Port f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 in datapath 9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e bound to our chassis
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.757 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.777 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[1d059cf1-aef1-47f3-8bf6-49831ff42c7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.816 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a911a5-9403-445d-91bd-f29d77200d78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.821 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d54815-cc13-4a26-9e3b-d41fce5d2972]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.858 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[70ac61af-b57a-4036-891a-8b500e0315ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:36 compute-0 nova_compute[186176]: 2026-02-16 17:50:36.873 186180 INFO nova.compute.manager [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Post operation of migration started
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.881 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[615601c9-68b6-4638-8e68-0f87b532b822]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9fcb1bf9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:7d:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575419, 'reachable_time': 42651, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215675, 'error': None, 'target': 'ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.899 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[aca4e143-fd3d-4edb-ad86-5645a7f22e0b]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9fcb1bf9-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 575429, 'tstamp': 575429}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215676, 'error': None, 'target': 'ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9fcb1bf9-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 575432, 'tstamp': 575432}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215676, 'error': None, 'target': 'ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.901 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9fcb1bf9-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:36 compute-0 nova_compute[186176]: 2026-02-16 17:50:36.904 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:36 compute-0 nova_compute[186176]: 2026-02-16 17:50:36.905 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.906 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9fcb1bf9-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.907 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.908 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9fcb1bf9-d0, col_values=(('external_ids', {'iface-id': '613be906-2a79-4164-80b2-078ce66608ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:36 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:36.908 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:50:37 compute-0 nova_compute[186176]: 2026-02-16 17:50:37.161 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-96b40564-34c8-4e82-b296-76a49bc59876" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:50:37 compute-0 nova_compute[186176]: 2026-02-16 17:50:37.162 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-96b40564-34c8-4e82-b296-76a49bc59876" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:50:37 compute-0 nova_compute[186176]: 2026-02-16 17:50:37.162 186180 DEBUG nova.network.neutron [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:50:38 compute-0 nova_compute[186176]: 2026-02-16 17:50:38.053 186180 DEBUG nova.network.neutron [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Updating instance_info_cache with network_info: [{"id": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "address": "fa:16:3e:6a:ce:18", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2e5b5a7-0e", "ovs_interfaceid": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:50:38 compute-0 nova_compute[186176]: 2026-02-16 17:50:38.068 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-96b40564-34c8-4e82-b296-76a49bc59876" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:50:38 compute-0 nova_compute[186176]: 2026-02-16 17:50:38.084 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:38 compute-0 nova_compute[186176]: 2026-02-16 17:50:38.085 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:38 compute-0 nova_compute[186176]: 2026-02-16 17:50:38.085 186180 DEBUG oslo_concurrency.lockutils [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:38 compute-0 nova_compute[186176]: 2026-02-16 17:50:38.091 186180 INFO nova.virt.libvirt.driver [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 16 17:50:38 compute-0 virtqemud[185389]: Domain id=20 name='instance-0000001a' uuid=96b40564-34c8-4e82-b296-76a49bc59876 is tainted: custom-monitor
Feb 16 17:50:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:38.184 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:38.186 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:38.186 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:39 compute-0 nova_compute[186176]: 2026-02-16 17:50:39.098 186180 INFO nova.virt.libvirt.driver [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 16 17:50:40 compute-0 nova_compute[186176]: 2026-02-16 17:50:40.009 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:40 compute-0 podman[215677]: 2026-02-16 17:50:40.094919111 +0000 UTC m=+0.065088392 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 16 17:50:40 compute-0 nova_compute[186176]: 2026-02-16 17:50:40.105 186180 INFO nova.virt.libvirt.driver [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 16 17:50:40 compute-0 nova_compute[186176]: 2026-02-16 17:50:40.111 186180 DEBUG nova.compute.manager [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:50:40 compute-0 nova_compute[186176]: 2026-02-16 17:50:40.137 186180 DEBUG nova.objects.instance [None req-01d7ae40-fe46-489f-8910-99ba347f938a b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 16 17:50:40 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:40.457 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:41 compute-0 nova_compute[186176]: 2026-02-16 17:50:41.412 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:43 compute-0 podman[215699]: 2026-02-16 17:50:43.124582278 +0000 UTC m=+0.093546183 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.646 186180 DEBUG oslo_concurrency.lockutils [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "96b40564-34c8-4e82-b296-76a49bc59876" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.647 186180 DEBUG oslo_concurrency.lockutils [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "96b40564-34c8-4e82-b296-76a49bc59876" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.647 186180 DEBUG oslo_concurrency.lockutils [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "96b40564-34c8-4e82-b296-76a49bc59876-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.648 186180 DEBUG oslo_concurrency.lockutils [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "96b40564-34c8-4e82-b296-76a49bc59876-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.648 186180 DEBUG oslo_concurrency.lockutils [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "96b40564-34c8-4e82-b296-76a49bc59876-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.650 186180 INFO nova.compute.manager [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Terminating instance
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.651 186180 DEBUG nova.compute.manager [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:50:44 compute-0 kernel: tapf2e5b5a7-0e (unregistering): left promiscuous mode
Feb 16 17:50:44 compute-0 NetworkManager[56463]: <info>  [1771264244.6806] device (tapf2e5b5a7-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.686 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:44 compute-0 ovn_controller[96437]: 2026-02-16T17:50:44Z|00208|binding|INFO|Releasing lport f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 from this chassis (sb_readonly=0)
Feb 16 17:50:44 compute-0 ovn_controller[96437]: 2026-02-16T17:50:44Z|00209|binding|INFO|Setting lport f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 down in Southbound
Feb 16 17:50:44 compute-0 ovn_controller[96437]: 2026-02-16T17:50:44Z|00210|binding|INFO|Removing iface tapf2e5b5a7-0e ovn-installed in OVS
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.694 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:ce:18 10.100.0.9'], port_security=['fa:16:3e:6a:ce:18 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '96b40564-34c8-4e82-b296-76a49bc59876', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0f8251f0a9a482d879e8298c02a9652', 'neutron:revision_number': '13', 'neutron:security_group_ids': '5f95a2b3-a7bc-49f8-9945-a529603420cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=092b2aac-8232-424c-92e7-57054cbc7fed, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.695 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.696 105730 INFO neutron.agent.ovn.metadata.agent [-] Port f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 in datapath 9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e unbound from our chassis
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.698 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.717 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[0b52f6ef-4b9b-45cf-9273-017527566a81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:44 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Feb 16 17:50:44 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000001a.scope: Consumed 2.254s CPU time.
Feb 16 17:50:44 compute-0 systemd-machined[155631]: Machine qemu-20-instance-0000001a terminated.
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.742 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[5b6c5a3c-bcfe-43f2-89b0-36cb5c425d52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.745 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[ed37febf-f81e-4c7e-a249-6aa7f2d4e077]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.765 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[7e304e91-a372-432e-956b-21c52d3ddc57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.785 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[7093d061-85f6-43b4-97e5-bb000586f22f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9fcb1bf9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ca:7d:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575419, 'reachable_time': 42651, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215731, 'error': None, 'target': 'ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.802 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[34c3f026-4e8d-425a-b2c0-443a54f4938f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9fcb1bf9-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 575429, 'tstamp': 575429}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215732, 'error': None, 'target': 'ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9fcb1bf9-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 575432, 'tstamp': 575432}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215732, 'error': None, 'target': 'ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.805 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9fcb1bf9-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.807 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.810 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.811 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9fcb1bf9-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.812 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.813 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9fcb1bf9-d0, col_values=(('external_ids', {'iface-id': '613be906-2a79-4164-80b2-078ce66608ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:44.813 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.878 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.882 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.935 186180 INFO nova.virt.libvirt.driver [-] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Instance destroyed successfully.
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.936 186180 DEBUG nova.objects.instance [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lazy-loading 'resources' on Instance uuid 96b40564-34c8-4e82-b296-76a49bc59876 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.951 186180 DEBUG nova.virt.libvirt.vif [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-16T17:49:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-1596844720',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-1596844720',id=26,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:49:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d0f8251f0a9a482d879e8298c02a9652',ramdisk_id='',reservation_id='r-45094i9g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-485487738',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-485487738-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:50:40Z,user_data=None,user_id='aace4ef5f521473ca481eaa58a289951',uuid=96b40564-34c8-4e82-b296-76a49bc59876,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "address": "fa:16:3e:6a:ce:18", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2e5b5a7-0e", "ovs_interfaceid": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.951 186180 DEBUG nova.network.os_vif_util [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Converting VIF {"id": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "address": "fa:16:3e:6a:ce:18", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf2e5b5a7-0e", "ovs_interfaceid": "f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.952 186180 DEBUG nova.network.os_vif_util [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:ce:18,bridge_name='br-int',has_traffic_filtering=True,id=f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2e5b5a7-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.952 186180 DEBUG os_vif [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:ce:18,bridge_name='br-int',has_traffic_filtering=True,id=f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2e5b5a7-0e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.953 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.954 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2e5b5a7-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.955 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.957 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.959 186180 INFO os_vif [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:ce:18,bridge_name='br-int',has_traffic_filtering=True,id=f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf2e5b5a7-0e')
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.959 186180 INFO nova.virt.libvirt.driver [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Deleting instance files /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876_del
Feb 16 17:50:44 compute-0 nova_compute[186176]: 2026-02-16 17:50:44.960 186180 INFO nova.virt.libvirt.driver [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Deletion of /var/lib/nova/instances/96b40564-34c8-4e82-b296-76a49bc59876_del complete
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.012 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.019 186180 INFO nova.compute.manager [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Took 0.37 seconds to destroy the instance on the hypervisor.
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.019 186180 DEBUG oslo.service.loopingcall [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.020 186180 DEBUG nova.compute.manager [-] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.020 186180 DEBUG nova.network.neutron [-] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.226 186180 DEBUG nova.compute.manager [req-48e013a0-8818-406c-b4e4-230e3e54c6ae req-010079b4-5d9c-4899-bd7a-efc436d03b3f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Received event network-vif-unplugged-f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.226 186180 DEBUG oslo_concurrency.lockutils [req-48e013a0-8818-406c-b4e4-230e3e54c6ae req-010079b4-5d9c-4899-bd7a-efc436d03b3f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "96b40564-34c8-4e82-b296-76a49bc59876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.226 186180 DEBUG oslo_concurrency.lockutils [req-48e013a0-8818-406c-b4e4-230e3e54c6ae req-010079b4-5d9c-4899-bd7a-efc436d03b3f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96b40564-34c8-4e82-b296-76a49bc59876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.227 186180 DEBUG oslo_concurrency.lockutils [req-48e013a0-8818-406c-b4e4-230e3e54c6ae req-010079b4-5d9c-4899-bd7a-efc436d03b3f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96b40564-34c8-4e82-b296-76a49bc59876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.227 186180 DEBUG nova.compute.manager [req-48e013a0-8818-406c-b4e4-230e3e54c6ae req-010079b4-5d9c-4899-bd7a-efc436d03b3f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] No waiting events found dispatching network-vif-unplugged-f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.227 186180 DEBUG nova.compute.manager [req-48e013a0-8818-406c-b4e4-230e3e54c6ae req-010079b4-5d9c-4899-bd7a-efc436d03b3f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Received event network-vif-unplugged-f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.630 186180 DEBUG nova.network.neutron [-] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.652 186180 INFO nova.compute.manager [-] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Took 0.63 seconds to deallocate network for instance.
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.682 186180 DEBUG nova.compute.manager [req-d7712887-35e6-4a6a-adca-4e6b6bcb6fcb req-1113910c-82e8-47b0-8611-5b5f400b58f8 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Received event network-vif-deleted-f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.695 186180 DEBUG oslo_concurrency.lockutils [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.696 186180 DEBUG oslo_concurrency.lockutils [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.702 186180 DEBUG oslo_concurrency.lockutils [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.732 186180 INFO nova.scheduler.client.report [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Deleted allocations for instance 96b40564-34c8-4e82-b296-76a49bc59876
Feb 16 17:50:45 compute-0 nova_compute[186176]: 2026-02-16 17:50:45.786 186180 DEBUG oslo_concurrency.lockutils [None req-9e142f2a-0885-49cf-a121-e26cec9da5c7 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "96b40564-34c8-4e82-b296-76a49bc59876" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.599 186180 DEBUG oslo_concurrency.lockutils [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.599 186180 DEBUG oslo_concurrency.lockutils [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.600 186180 DEBUG oslo_concurrency.lockutils [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.600 186180 DEBUG oslo_concurrency.lockutils [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.600 186180 DEBUG oslo_concurrency.lockutils [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.601 186180 INFO nova.compute.manager [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Terminating instance
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.602 186180 DEBUG nova.compute.manager [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 16 17:50:46 compute-0 kernel: tap322cec24-ae (unregistering): left promiscuous mode
Feb 16 17:50:46 compute-0 NetworkManager[56463]: <info>  [1771264246.6221] device (tap322cec24-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.622 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:46 compute-0 ovn_controller[96437]: 2026-02-16T17:50:46Z|00211|binding|INFO|Releasing lport 322cec24-ae8f-4853-aca6-9609258d7523 from this chassis (sb_readonly=0)
Feb 16 17:50:46 compute-0 ovn_controller[96437]: 2026-02-16T17:50:46Z|00212|binding|INFO|Setting lport 322cec24-ae8f-4853-aca6-9609258d7523 down in Southbound
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.628 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:46 compute-0 ovn_controller[96437]: 2026-02-16T17:50:46Z|00213|binding|INFO|Removing iface tap322cec24-ae ovn-installed in OVS
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.631 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.635 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:46.643 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:88:d5:1d 10.100.0.14'], port_security=['fa:16:3e:88:d5:1d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ad140b46-c259-4541-b51e-ad0fcc4c26d4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0f8251f0a9a482d879e8298c02a9652', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f95a2b3-a7bc-49f8-9945-a529603420cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=092b2aac-8232-424c-92e7-57054cbc7fed, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=322cec24-ae8f-4853-aca6-9609258d7523) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:50:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:46.645 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 322cec24-ae8f-4853-aca6-9609258d7523 in datapath 9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e unbound from our chassis
Feb 16 17:50:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:46.647 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:50:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:46.649 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[60a5ae29-c483-41ec-a24d-b57bd0448a54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:46 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:46.649 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e namespace which is not needed anymore
Feb 16 17:50:46 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Deactivated successfully.
Feb 16 17:50:46 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Consumed 14.887s CPU time.
Feb 16 17:50:46 compute-0 systemd-machined[155631]: Machine qemu-19-instance-00000019 terminated.
Feb 16 17:50:46 compute-0 neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e[215354]: [NOTICE]   (215358) : haproxy version is 2.8.14-c23fe91
Feb 16 17:50:46 compute-0 neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e[215354]: [NOTICE]   (215358) : path to executable is /usr/sbin/haproxy
Feb 16 17:50:46 compute-0 neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e[215354]: [WARNING]  (215358) : Exiting Master process...
Feb 16 17:50:46 compute-0 neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e[215354]: [WARNING]  (215358) : Exiting Master process...
Feb 16 17:50:46 compute-0 neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e[215354]: [ALERT]    (215358) : Current worker (215360) exited with code 143 (Terminated)
Feb 16 17:50:46 compute-0 neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e[215354]: [WARNING]  (215358) : All workers exited. Exiting... (0)
Feb 16 17:50:46 compute-0 systemd[1]: libpod-292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b.scope: Deactivated successfully.
Feb 16 17:50:46 compute-0 podman[215771]: 2026-02-16 17:50:46.843744971 +0000 UTC m=+0.070374083 container died 292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.872 186180 INFO nova.virt.libvirt.driver [-] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Instance destroyed successfully.
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.872 186180 DEBUG nova.objects.instance [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lazy-loading 'resources' on Instance uuid ad140b46-c259-4541-b51e-ad0fcc4c26d4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:50:46 compute-0 systemd[1]: var-lib-containers-storage-overlay-90ee2e15ed80bec4af51bf43ec9fa4866b1c204cbd204f0df5ad6b7450039af9-merged.mount: Deactivated successfully.
Feb 16 17:50:46 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b-userdata-shm.mount: Deactivated successfully.
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.891 186180 DEBUG nova.virt.libvirt.vif [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:49:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-666654609',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-666654609',id=25,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:49:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d0f8251f0a9a482d879e8298c02a9652',ramdisk_id='',reservation_id='r-w0i03c7g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-485487738',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-485487738-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:49:19Z,user_data=None,user_id='aace4ef5f521473ca481eaa58a289951',uuid=ad140b46-c259-4541-b51e-ad0fcc4c26d4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "322cec24-ae8f-4853-aca6-9609258d7523", "address": "fa:16:3e:88:d5:1d", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap322cec24-ae", "ovs_interfaceid": "322cec24-ae8f-4853-aca6-9609258d7523", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.892 186180 DEBUG nova.network.os_vif_util [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Converting VIF {"id": "322cec24-ae8f-4853-aca6-9609258d7523", "address": "fa:16:3e:88:d5:1d", "network": {"id": "9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-202660006-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0f8251f0a9a482d879e8298c02a9652", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap322cec24-ae", "ovs_interfaceid": "322cec24-ae8f-4853-aca6-9609258d7523", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.893 186180 DEBUG nova.network.os_vif_util [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:88:d5:1d,bridge_name='br-int',has_traffic_filtering=True,id=322cec24-ae8f-4853-aca6-9609258d7523,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap322cec24-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.893 186180 DEBUG os_vif [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:d5:1d,bridge_name='br-int',has_traffic_filtering=True,id=322cec24-ae8f-4853-aca6-9609258d7523,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap322cec24-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.894 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.894 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap322cec24-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:46 compute-0 podman[215771]: 2026-02-16 17:50:46.894730686 +0000 UTC m=+0.121359828 container cleanup 292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.934 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.936 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.938 186180 INFO os_vif [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:88:d5:1d,bridge_name='br-int',has_traffic_filtering=True,id=322cec24-ae8f-4853-aca6-9609258d7523,network=Network(9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap322cec24-ae')
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.938 186180 INFO nova.virt.libvirt.driver [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Deleting instance files /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4_del
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.939 186180 INFO nova.virt.libvirt.driver [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Deletion of /var/lib/nova/instances/ad140b46-c259-4541-b51e-ad0fcc4c26d4_del complete
Feb 16 17:50:46 compute-0 systemd[1]: libpod-conmon-292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b.scope: Deactivated successfully.
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.995 186180 INFO nova.compute.manager [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Took 0.39 seconds to destroy the instance on the hypervisor.
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.996 186180 DEBUG oslo.service.loopingcall [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.996 186180 DEBUG nova.compute.manager [-] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 16 17:50:46 compute-0 nova_compute[186176]: 2026-02-16 17:50:46.996 186180 DEBUG nova.network.neutron [-] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 16 17:50:47 compute-0 podman[215816]: 2026-02-16 17:50:47.012456063 +0000 UTC m=+0.051082678 container remove 292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 16 17:50:47 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:47.017 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[366e9b13-c319-4673-9f0e-5f7d06712ec3]: (4, ('Mon Feb 16 05:50:46 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e (292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b)\n292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b\nMon Feb 16 05:50:46 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e (292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b)\n292c7d52935f314732aed031cf0372db512d7fc184bec93f4a2124b5e314877b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:47 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:47.019 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[02b9153c-78a5-4d7e-bb4e-a3e3250b8aa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:47 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:47.020 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9fcb1bf9-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.022 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:47 compute-0 kernel: tap9fcb1bf9-d0: left promiscuous mode
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.029 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.031 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:47 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:47.034 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[c3814e3b-f9ca-4eb0-ae65-9379c95b2e02]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:47 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:47.052 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[03d02e90-bb16-44bf-8ea9-fa0129df3ea5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:47 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:47.054 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b3dc2098-4cb3-40a8-bf7d-d3a1079bf3f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:47 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:47.074 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[74a97d23-984f-4075-bc9b-d6988c40803a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575413, 'reachable_time': 25238, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215831, 'error': None, 'target': 'ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:47 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:47.076 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9fcb1bf9-d26f-4bc9-93e2-8a28d0cb767e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:50:47 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:50:47.077 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[47e25d9e-98d4-4d3b-ace2-675f45a0b4e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:50:47 compute-0 systemd[1]: run-netns-ovnmeta\x2d9fcb1bf9\x2dd26f\x2d4bc9\x2d93e2\x2d8a28d0cb767e.mount: Deactivated successfully.
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.313 186180 DEBUG nova.compute.manager [req-e1b05233-33ee-4bd2-8473-c7ebd9c48de6 req-e244fc1e-fc10-4b78-99a5-12f84fed4e1c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Received event network-vif-plugged-f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.313 186180 DEBUG oslo_concurrency.lockutils [req-e1b05233-33ee-4bd2-8473-c7ebd9c48de6 req-e244fc1e-fc10-4b78-99a5-12f84fed4e1c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "96b40564-34c8-4e82-b296-76a49bc59876-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.314 186180 DEBUG oslo_concurrency.lockutils [req-e1b05233-33ee-4bd2-8473-c7ebd9c48de6 req-e244fc1e-fc10-4b78-99a5-12f84fed4e1c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96b40564-34c8-4e82-b296-76a49bc59876-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.314 186180 DEBUG oslo_concurrency.lockutils [req-e1b05233-33ee-4bd2-8473-c7ebd9c48de6 req-e244fc1e-fc10-4b78-99a5-12f84fed4e1c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "96b40564-34c8-4e82-b296-76a49bc59876-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.315 186180 DEBUG nova.compute.manager [req-e1b05233-33ee-4bd2-8473-c7ebd9c48de6 req-e244fc1e-fc10-4b78-99a5-12f84fed4e1c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] No waiting events found dispatching network-vif-plugged-f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.315 186180 WARNING nova.compute.manager [req-e1b05233-33ee-4bd2-8473-c7ebd9c48de6 req-e244fc1e-fc10-4b78-99a5-12f84fed4e1c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Received unexpected event network-vif-plugged-f2e5b5a7-0e9e-4ecd-919d-a132cfff0f14 for instance with vm_state deleted and task_state None.
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.845 186180 DEBUG nova.compute.manager [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Received event network-vif-unplugged-322cec24-ae8f-4853-aca6-9609258d7523 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.845 186180 DEBUG oslo_concurrency.lockutils [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.846 186180 DEBUG oslo_concurrency.lockutils [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.846 186180 DEBUG oslo_concurrency.lockutils [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.847 186180 DEBUG nova.compute.manager [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] No waiting events found dispatching network-vif-unplugged-322cec24-ae8f-4853-aca6-9609258d7523 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.847 186180 DEBUG nova.compute.manager [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Received event network-vif-unplugged-322cec24-ae8f-4853-aca6-9609258d7523 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.847 186180 DEBUG nova.compute.manager [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Received event network-vif-plugged-322cec24-ae8f-4853-aca6-9609258d7523 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.848 186180 DEBUG oslo_concurrency.lockutils [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.848 186180 DEBUG oslo_concurrency.lockutils [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.848 186180 DEBUG oslo_concurrency.lockutils [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.849 186180 DEBUG nova.compute.manager [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] No waiting events found dispatching network-vif-plugged-322cec24-ae8f-4853-aca6-9609258d7523 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:50:47 compute-0 nova_compute[186176]: 2026-02-16 17:50:47.849 186180 WARNING nova.compute.manager [req-b296fcad-6fa1-450b-90b4-04d1385a0860 req-e4c380fa-9aa2-4ca9-9f09-90690ad34985 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Received unexpected event network-vif-plugged-322cec24-ae8f-4853-aca6-9609258d7523 for instance with vm_state active and task_state deleting.
Feb 16 17:50:48 compute-0 nova_compute[186176]: 2026-02-16 17:50:48.037 186180 DEBUG nova.network.neutron [-] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:50:48 compute-0 nova_compute[186176]: 2026-02-16 17:50:48.062 186180 INFO nova.compute.manager [-] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Took 1.07 seconds to deallocate network for instance.
Feb 16 17:50:48 compute-0 nova_compute[186176]: 2026-02-16 17:50:48.115 186180 DEBUG oslo_concurrency.lockutils [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:50:48 compute-0 nova_compute[186176]: 2026-02-16 17:50:48.117 186180 DEBUG oslo_concurrency.lockutils [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:50:48 compute-0 nova_compute[186176]: 2026-02-16 17:50:48.168 186180 DEBUG nova.compute.provider_tree [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:50:48 compute-0 nova_compute[186176]: 2026-02-16 17:50:48.185 186180 DEBUG nova.scheduler.client.report [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:50:48 compute-0 nova_compute[186176]: 2026-02-16 17:50:48.210 186180 DEBUG oslo_concurrency.lockutils [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:48 compute-0 nova_compute[186176]: 2026-02-16 17:50:48.240 186180 INFO nova.scheduler.client.report [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Deleted allocations for instance ad140b46-c259-4541-b51e-ad0fcc4c26d4
Feb 16 17:50:48 compute-0 nova_compute[186176]: 2026-02-16 17:50:48.309 186180 DEBUG oslo_concurrency.lockutils [None req-c8894ab5-fe3e-4404-bec8-2f44c984a1a1 aace4ef5f521473ca481eaa58a289951 d0f8251f0a9a482d879e8298c02a9652 - - default default] Lock "ad140b46-c259-4541-b51e-ad0fcc4c26d4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:50:49 compute-0 podman[215833]: 2026-02-16 17:50:49.113436075 +0000 UTC m=+0.073248683 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:50:49 compute-0 podman[215832]: 2026-02-16 17:50:49.147706149 +0000 UTC m=+0.112467689 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Feb 16 17:50:49 compute-0 nova_compute[186176]: 2026-02-16 17:50:49.956 186180 DEBUG nova.compute.manager [req-92a4f800-bd4c-45ff-8aa7-defdd9df2b3e req-4723b715-c7a1-4f31-bb18-5f798ddcb989 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Received event network-vif-deleted-322cec24-ae8f-4853-aca6-9609258d7523 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:50:50 compute-0 nova_compute[186176]: 2026-02-16 17:50:50.013 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:51 compute-0 nova_compute[186176]: 2026-02-16 17:50:51.976 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:54 compute-0 nova_compute[186176]: 2026-02-16 17:50:54.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:50:54 compute-0 nova_compute[186176]: 2026-02-16 17:50:54.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:50:54 compute-0 nova_compute[186176]: 2026-02-16 17:50:54.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:50:54 compute-0 nova_compute[186176]: 2026-02-16 17:50:54.331 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:50:55 compute-0 nova_compute[186176]: 2026-02-16 17:50:55.016 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:55 compute-0 nova_compute[186176]: 2026-02-16 17:50:55.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:50:56 compute-0 nova_compute[186176]: 2026-02-16 17:50:56.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:50:56 compute-0 nova_compute[186176]: 2026-02-16 17:50:56.327 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:50:56 compute-0 nova_compute[186176]: 2026-02-16 17:50:56.328 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:50:56 compute-0 nova_compute[186176]: 2026-02-16 17:50:56.980 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:50:57 compute-0 nova_compute[186176]: 2026-02-16 17:50:57.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:50:59 compute-0 podman[195505]: time="2026-02-16T17:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:50:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:50:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 17:50:59 compute-0 nova_compute[186176]: 2026-02-16 17:50:59.934 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771264244.9315586, 96b40564-34c8-4e82-b296-76a49bc59876 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:50:59 compute-0 nova_compute[186176]: 2026-02-16 17:50:59.935 186180 INFO nova.compute.manager [-] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] VM Stopped (Lifecycle Event)
Feb 16 17:50:59 compute-0 nova_compute[186176]: 2026-02-16 17:50:59.953 186180 DEBUG nova.compute.manager [None req-a76141cd-fd58-4bff-a74a-5c292bef44e1 - - - - - -] [instance: 96b40564-34c8-4e82-b296-76a49bc59876] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.018 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.342 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.343 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.344 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.344 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.564 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.566 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5807MB free_disk=73.22287368774414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.566 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.566 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.646 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.647 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.668 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.681 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.701 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:51:00 compute-0 nova_compute[186176]: 2026-02-16 17:51:00.701 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:51:01 compute-0 openstack_network_exporter[198360]: ERROR   17:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:51:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:51:01 compute-0 openstack_network_exporter[198360]: ERROR   17:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:51:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:51:01 compute-0 nova_compute[186176]: 2026-02-16 17:51:01.871 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771264246.8700147, ad140b46-c259-4541-b51e-ad0fcc4c26d4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:51:01 compute-0 nova_compute[186176]: 2026-02-16 17:51:01.872 186180 INFO nova.compute.manager [-] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] VM Stopped (Lifecycle Event)
Feb 16 17:51:01 compute-0 nova_compute[186176]: 2026-02-16 17:51:01.890 186180 DEBUG nova.compute.manager [None req-cc2f2945-b162-4e18-9732-0b7def82f323 - - - - - -] [instance: ad140b46-c259-4541-b51e-ad0fcc4c26d4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:51:01 compute-0 nova_compute[186176]: 2026-02-16 17:51:01.982 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:02 compute-0 nova_compute[186176]: 2026-02-16 17:51:02.702 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:51:05 compute-0 nova_compute[186176]: 2026-02-16 17:51:05.020 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:05 compute-0 nova_compute[186176]: 2026-02-16 17:51:05.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:51:06 compute-0 nova_compute[186176]: 2026-02-16 17:51:06.985 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:10 compute-0 nova_compute[186176]: 2026-02-16 17:51:10.021 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:10 compute-0 nova_compute[186176]: 2026-02-16 17:51:10.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:51:11 compute-0 podman[215883]: 2026-02-16 17:51:11.13825453 +0000 UTC m=+0.108554282 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Feb 16 17:51:11 compute-0 nova_compute[186176]: 2026-02-16 17:51:11.989 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:14 compute-0 podman[215905]: 2026-02-16 17:51:14.104478155 +0000 UTC m=+0.070729522 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:51:15 compute-0 nova_compute[186176]: 2026-02-16 17:51:15.025 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:16 compute-0 nova_compute[186176]: 2026-02-16 17:51:16.993 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:17 compute-0 ovn_controller[96437]: 2026-02-16T17:51:17Z|00214|memory_trim|INFO|Detected inactivity (last active 30017 ms ago): trimming memory
Feb 16 17:51:20 compute-0 nova_compute[186176]: 2026-02-16 17:51:20.025 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:20 compute-0 podman[215926]: 2026-02-16 17:51:20.123868835 +0000 UTC m=+0.083582898 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:51:20 compute-0 podman[215925]: 2026-02-16 17:51:20.190237008 +0000 UTC m=+0.155692462 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 17:51:21 compute-0 nova_compute[186176]: 2026-02-16 17:51:21.995 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:25 compute-0 nova_compute[186176]: 2026-02-16 17:51:25.028 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:26 compute-0 nova_compute[186176]: 2026-02-16 17:51:26.569 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:26 compute-0 nova_compute[186176]: 2026-02-16 17:51:26.997 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:29 compute-0 podman[195505]: time="2026-02-16T17:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:51:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:51:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Feb 16 17:51:30 compute-0 nova_compute[186176]: 2026-02-16 17:51:30.030 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:31 compute-0 openstack_network_exporter[198360]: ERROR   17:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:51:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:51:31 compute-0 openstack_network_exporter[198360]: ERROR   17:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:51:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:51:32 compute-0 nova_compute[186176]: 2026-02-16 17:51:31.999 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:34 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:51:34.361 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:51:34 compute-0 nova_compute[186176]: 2026-02-16 17:51:34.362 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:34 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:51:34.364 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:51:35 compute-0 nova_compute[186176]: 2026-02-16 17:51:35.033 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:37 compute-0 nova_compute[186176]: 2026-02-16 17:51:37.002 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:51:38.185 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:51:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:51:38.186 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:51:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:51:38.186 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:51:40 compute-0 nova_compute[186176]: 2026-02-16 17:51:40.036 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:40 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:51:40.366 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:51:42 compute-0 nova_compute[186176]: 2026-02-16 17:51:42.006 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:42 compute-0 podman[215973]: 2026-02-16 17:51:42.122190905 +0000 UTC m=+0.091973244 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, architecture=x86_64)
Feb 16 17:51:45 compute-0 nova_compute[186176]: 2026-02-16 17:51:45.038 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:45 compute-0 podman[215995]: 2026-02-16 17:51:45.102328552 +0000 UTC m=+0.073373997 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 16 17:51:47 compute-0 nova_compute[186176]: 2026-02-16 17:51:47.009 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:50 compute-0 nova_compute[186176]: 2026-02-16 17:51:50.041 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:51 compute-0 podman[216014]: 2026-02-16 17:51:51.124911679 +0000 UTC m=+0.097276475 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 16 17:51:51 compute-0 podman[216015]: 2026-02-16 17:51:51.136990956 +0000 UTC m=+0.100859323 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:51:52 compute-0 nova_compute[186176]: 2026-02-16 17:51:52.012 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:55 compute-0 nova_compute[186176]: 2026-02-16 17:51:55.042 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:55 compute-0 nova_compute[186176]: 2026-02-16 17:51:55.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:51:55 compute-0 nova_compute[186176]: 2026-02-16 17:51:55.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:51:55 compute-0 nova_compute[186176]: 2026-02-16 17:51:55.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:51:55 compute-0 nova_compute[186176]: 2026-02-16 17:51:55.346 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:51:56 compute-0 nova_compute[186176]: 2026-02-16 17:51:56.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:51:57 compute-0 nova_compute[186176]: 2026-02-16 17:51:57.016 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:51:57 compute-0 nova_compute[186176]: 2026-02-16 17:51:57.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:51:57 compute-0 nova_compute[186176]: 2026-02-16 17:51:57.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:51:59 compute-0 nova_compute[186176]: 2026-02-16 17:51:59.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:51:59 compute-0 podman[195505]: time="2026-02-16T17:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:51:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:51:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Feb 16 17:52:00 compute-0 nova_compute[186176]: 2026-02-16 17:52:00.045 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.342 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.342 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.343 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.343 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:52:01 compute-0 openstack_network_exporter[198360]: ERROR   17:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:52:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:52:01 compute-0 openstack_network_exporter[198360]: ERROR   17:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:52:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.585 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.587 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5832MB free_disk=73.22378921508789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.587 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.588 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.658 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.659 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.749 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.775 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.777 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:52:01 compute-0 nova_compute[186176]: 2026-02-16 17:52:01.778 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:02 compute-0 nova_compute[186176]: 2026-02-16 17:52:02.018 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:02 compute-0 nova_compute[186176]: 2026-02-16 17:52:02.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:02 compute-0 nova_compute[186176]: 2026-02-16 17:52:02.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:02 compute-0 nova_compute[186176]: 2026-02-16 17:52:02.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 17:52:02 compute-0 nova_compute[186176]: 2026-02-16 17:52:02.332 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 17:52:02 compute-0 sshd-session[216064]: Invalid user solana from 2.57.122.210 port 58628
Feb 16 17:52:02 compute-0 sshd-session[216064]: Connection closed by invalid user solana 2.57.122.210 port 58628 [preauth]
Feb 16 17:52:03 compute-0 ovn_controller[96437]: 2026-02-16T17:52:03Z|00215|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Feb 16 17:52:04 compute-0 nova_compute[186176]: 2026-02-16 17:52:04.331 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.048 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.673 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.674 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.694 186180 DEBUG nova.compute.manager [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.753 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.754 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.763 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.763 186180 INFO nova.compute.claims [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.873 186180 DEBUG nova.compute.provider_tree [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.890 186180 DEBUG nova.scheduler.client.report [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.919 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.920 186180 DEBUG nova.compute.manager [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.965 186180 DEBUG nova.compute.manager [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.966 186180 DEBUG nova.network.neutron [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:52:05 compute-0 nova_compute[186176]: 2026-02-16 17:52:05.990 186180 INFO nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.010 186180 DEBUG nova.compute.manager [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.145 186180 DEBUG nova.compute.manager [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.147 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.148 186180 INFO nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Creating image(s)
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.149 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Acquiring lock "/var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.149 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "/var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.151 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "/var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.179 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.264 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.266 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.267 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.288 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.355 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.357 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.389 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.390 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.391 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.441 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.442 186180 DEBUG nova.virt.disk.api [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Checking if we can resize image /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.443 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.499 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.500 186180 DEBUG nova.virt.disk.api [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Cannot resize image /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.500 186180 DEBUG nova.objects.instance [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lazy-loading 'migration_context' on Instance uuid 0261deed-431b-4294-a4e7-e2a37fe3601b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.519 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.520 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Ensure instance console log exists: /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.520 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.521 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:06 compute-0 nova_compute[186176]: 2026-02-16 17:52:06.521 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:07 compute-0 nova_compute[186176]: 2026-02-16 17:52:07.022 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:07 compute-0 nova_compute[186176]: 2026-02-16 17:52:07.178 186180 DEBUG nova.policy [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8873acc2ce444a0e8eb100e6fdac8df7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd546004cb905437e918387be6c24c45b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:52:07 compute-0 nova_compute[186176]: 2026-02-16 17:52:07.463 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:09 compute-0 nova_compute[186176]: 2026-02-16 17:52:09.252 186180 DEBUG nova.network.neutron [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Successfully created port: 0653b14c-0e3a-4352-ac9c-52c00e138e2b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:52:10 compute-0 nova_compute[186176]: 2026-02-16 17:52:10.051 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:10 compute-0 nova_compute[186176]: 2026-02-16 17:52:10.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:10 compute-0 nova_compute[186176]: 2026-02-16 17:52:10.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 17:52:11 compute-0 nova_compute[186176]: 2026-02-16 17:52:11.341 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:11 compute-0 nova_compute[186176]: 2026-02-16 17:52:11.398 186180 DEBUG nova.network.neutron [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Successfully updated port: 0653b14c-0e3a-4352-ac9c-52c00e138e2b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:52:11 compute-0 nova_compute[186176]: 2026-02-16 17:52:11.418 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Acquiring lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:52:11 compute-0 nova_compute[186176]: 2026-02-16 17:52:11.418 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Acquired lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:52:11 compute-0 nova_compute[186176]: 2026-02-16 17:52:11.418 186180 DEBUG nova.network.neutron [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:52:11 compute-0 nova_compute[186176]: 2026-02-16 17:52:11.531 186180 DEBUG nova.compute.manager [req-dfa010c2-9d35-4f90-a231-61612d1a175e req-531c75ad-3d34-4314-ac1a-d04465b69f5a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-changed-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:11 compute-0 nova_compute[186176]: 2026-02-16 17:52:11.533 186180 DEBUG nova.compute.manager [req-dfa010c2-9d35-4f90-a231-61612d1a175e req-531c75ad-3d34-4314-ac1a-d04465b69f5a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Refreshing instance network info cache due to event network-changed-0653b14c-0e3a-4352-ac9c-52c00e138e2b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:52:11 compute-0 nova_compute[186176]: 2026-02-16 17:52:11.533 186180 DEBUG oslo_concurrency.lockutils [req-dfa010c2-9d35-4f90-a231-61612d1a175e req-531c75ad-3d34-4314-ac1a-d04465b69f5a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:52:12 compute-0 nova_compute[186176]: 2026-02-16 17:52:12.024 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:12 compute-0 nova_compute[186176]: 2026-02-16 17:52:12.153 186180 DEBUG nova.network.neutron [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:52:13 compute-0 podman[216081]: 2026-02-16 17:52:13.108368223 +0000 UTC m=+0.077686533 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.346 186180 DEBUG nova.network.neutron [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Updating instance_info_cache with network_info: [{"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.376 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Releasing lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.377 186180 DEBUG nova.compute.manager [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Instance network_info: |[{"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.378 186180 DEBUG oslo_concurrency.lockutils [req-dfa010c2-9d35-4f90-a231-61612d1a175e req-531c75ad-3d34-4314-ac1a-d04465b69f5a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.378 186180 DEBUG nova.network.neutron [req-dfa010c2-9d35-4f90-a231-61612d1a175e req-531c75ad-3d34-4314-ac1a-d04465b69f5a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Refreshing network info cache for port 0653b14c-0e3a-4352-ac9c-52c00e138e2b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.383 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Start _get_guest_xml network_info=[{"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.390 186180 WARNING nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.400 186180 DEBUG nova.virt.libvirt.host [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.401 186180 DEBUG nova.virt.libvirt.host [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.407 186180 DEBUG nova.virt.libvirt.host [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.407 186180 DEBUG nova.virt.libvirt.host [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.409 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.410 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.411 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.411 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.412 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.412 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.412 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.413 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.413 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.414 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.414 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.415 186180 DEBUG nova.virt.hardware [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.421 186180 DEBUG nova.virt.libvirt.vif [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:52:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-2052325885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-2052325885',id=27,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d546004cb905437e918387be6c24c45b',ramdisk_id='',reservation_id='r-644siyl2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-29330499',owner_user_name='tempest-TestExecuteWorkloadBalan
cingStrategy-29330499-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:52:06Z,user_data=None,user_id='8873acc2ce444a0e8eb100e6fdac8df7',uuid=0261deed-431b-4294-a4e7-e2a37fe3601b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.422 186180 DEBUG nova.network.os_vif_util [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Converting VIF {"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.423 186180 DEBUG nova.network.os_vif_util [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:10:15,bridge_name='br-int',has_traffic_filtering=True,id=0653b14c-0e3a-4352-ac9c-52c00e138e2b,network=Network(e1786415-20fc-4c7a-ab9f-da4b30eabed4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0653b14c-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.425 186180 DEBUG nova.objects.instance [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lazy-loading 'pci_devices' on Instance uuid 0261deed-431b-4294-a4e7-e2a37fe3601b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.449 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:52:13 compute-0 nova_compute[186176]:   <uuid>0261deed-431b-4294-a4e7-e2a37fe3601b</uuid>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   <name>instance-0000001b</name>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-2052325885</nova:name>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:52:13</nova:creationTime>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:52:13 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:52:13 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:52:13 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:52:13 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:52:13 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:52:13 compute-0 nova_compute[186176]:         <nova:user uuid="8873acc2ce444a0e8eb100e6fdac8df7">tempest-TestExecuteWorkloadBalancingStrategy-29330499-project-member</nova:user>
Feb 16 17:52:13 compute-0 nova_compute[186176]:         <nova:project uuid="d546004cb905437e918387be6c24c45b">tempest-TestExecuteWorkloadBalancingStrategy-29330499</nova:project>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:52:13 compute-0 nova_compute[186176]:         <nova:port uuid="0653b14c-0e3a-4352-ac9c-52c00e138e2b">
Feb 16 17:52:13 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <system>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <entry name="serial">0261deed-431b-4294-a4e7-e2a37fe3601b</entry>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <entry name="uuid">0261deed-431b-4294-a4e7-e2a37fe3601b</entry>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     </system>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   <os>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   </os>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   <features>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   </features>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk.config"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:6a:10:15"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <target dev="tap0653b14c-0e"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/console.log" append="off"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <video>
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     </video>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:52:13 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:52:13 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:52:13 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:52:13 compute-0 nova_compute[186176]: </domain>
Feb 16 17:52:13 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.450 186180 DEBUG nova.compute.manager [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Preparing to wait for external event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.451 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.451 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.452 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.453 186180 DEBUG nova.virt.libvirt.vif [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:52:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-2052325885',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-2052325885',id=27,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d546004cb905437e918387be6c24c45b',ramdisk_id='',reservation_id='r-644siyl2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-29330499',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-29330499-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:52:06Z,user_data=None,user_id='8873acc2ce444a0e8eb100e6fdac8df7',uuid=0261deed-431b-4294-a4e7-e2a37fe3601b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.453 186180 DEBUG nova.network.os_vif_util [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Converting VIF {"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.454 186180 DEBUG nova.network.os_vif_util [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:10:15,bridge_name='br-int',has_traffic_filtering=True,id=0653b14c-0e3a-4352-ac9c-52c00e138e2b,network=Network(e1786415-20fc-4c7a-ab9f-da4b30eabed4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0653b14c-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.455 186180 DEBUG os_vif [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:10:15,bridge_name='br-int',has_traffic_filtering=True,id=0653b14c-0e3a-4352-ac9c-52c00e138e2b,network=Network(e1786415-20fc-4c7a-ab9f-da4b30eabed4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0653b14c-0e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.456 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.456 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.457 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.461 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.461 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0653b14c-0e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.462 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0653b14c-0e, col_values=(('external_ids', {'iface-id': '0653b14c-0e3a-4352-ac9c-52c00e138e2b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:10:15', 'vm-uuid': '0261deed-431b-4294-a4e7-e2a37fe3601b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.465 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:13 compute-0 NetworkManager[56463]: <info>  [1771264333.4664] manager: (tap0653b14c-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.467 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.474 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.475 186180 INFO os_vif [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:10:15,bridge_name='br-int',has_traffic_filtering=True,id=0653b14c-0e3a-4352-ac9c-52c00e138e2b,network=Network(e1786415-20fc-4c7a-ab9f-da4b30eabed4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0653b14c-0e')
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.553 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.553 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.553 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] No VIF found with MAC fa:16:3e:6a:10:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:52:13 compute-0 nova_compute[186176]: 2026-02-16 17:52:13.554 186180 INFO nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Using config drive
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.274 186180 INFO nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Creating config drive at /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk.config
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.282 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqkuurxle execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.411 186180 DEBUG oslo_concurrency.processutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpqkuurxle" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:52:14 compute-0 kernel: tap0653b14c-0e: entered promiscuous mode
Feb 16 17:52:14 compute-0 ovn_controller[96437]: 2026-02-16T17:52:14Z|00216|binding|INFO|Claiming lport 0653b14c-0e3a-4352-ac9c-52c00e138e2b for this chassis.
Feb 16 17:52:14 compute-0 NetworkManager[56463]: <info>  [1771264334.4774] manager: (tap0653b14c-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Feb 16 17:52:14 compute-0 ovn_controller[96437]: 2026-02-16T17:52:14Z|00217|binding|INFO|0653b14c-0e3a-4352-ac9c-52c00e138e2b: Claiming fa:16:3e:6a:10:15 10.100.0.8
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.476 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.479 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.484 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.493 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:10:15 10.100.0.8'], port_security=['fa:16:3e:6a:10:15 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0261deed-431b-4294-a4e7-e2a37fe3601b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e1786415-20fc-4c7a-ab9f-da4b30eabed4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd546004cb905437e918387be6c24c45b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b7d6196-f606-495d-b6e3-1ed1668bf149', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4823df8f-69fb-4d1d-9566-74558cb58ee1, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=0653b14c-0e3a-4352-ac9c-52c00e138e2b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.496 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 0653b14c-0e3a-4352-ac9c-52c00e138e2b in datapath e1786415-20fc-4c7a-ab9f-da4b30eabed4 bound to our chassis
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.498 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e1786415-20fc-4c7a-ab9f-da4b30eabed4
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.503 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:14 compute-0 ovn_controller[96437]: 2026-02-16T17:52:14Z|00218|binding|INFO|Setting lport 0653b14c-0e3a-4352-ac9c-52c00e138e2b ovn-installed in OVS
Feb 16 17:52:14 compute-0 ovn_controller[96437]: 2026-02-16T17:52:14Z|00219|binding|INFO|Setting lport 0653b14c-0e3a-4352-ac9c-52c00e138e2b up in Southbound
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.512 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:14 compute-0 systemd-udevd[216125]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.514 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[90be24c2-10b7-4873-99b7-641fda03e96b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.516 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape1786415-21 in ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:52:14 compute-0 systemd-machined[155631]: New machine qemu-21-instance-0000001b.
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.519 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape1786415-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.519 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[8b940c51-e3b1-4af6-a2be-347a7c15ab1f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.521 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e7c8158e-cb10-4c31-bea4-1c844e37a0d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 NetworkManager[56463]: <info>  [1771264334.5313] device (tap0653b14c-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:52:14 compute-0 NetworkManager[56463]: <info>  [1771264334.5321] device (tap0653b14c-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.532 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[fe817086-57c0-4311-8407-6647571786f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001b.
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.550 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e0cc9aed-0c1f-403f-a9d7-c3794062432b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.581 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[f48b36c7-179f-40f3-85e1-ba9ef425711a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.586 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[c23761bd-9cb1-4abf-993a-137a9822910a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 NetworkManager[56463]: <info>  [1771264334.5874] manager: (tape1786415-20): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.626 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[306f9313-3b2a-4428-a8ca-55d8ee167ca5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.631 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[f6f6e3cf-ff29-4c87-b1cd-a3d231ac0e47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 NetworkManager[56463]: <info>  [1771264334.6574] device (tape1786415-20): carrier: link connected
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.660 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[0041b221-4705-44b0-8c7e-df7862e3e882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.680 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[534f3993-3e3f-483b-863d-91a64220a47a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape1786415-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:8b:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593018, 'reachable_time': 20099, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216157, 'error': None, 'target': 'ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.691 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[06eb0b63-dec4-4438-8fd2-534005868cd1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feed:8b1f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 593018, 'tstamp': 593018}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216158, 'error': None, 'target': 'ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.709 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[00bd50ae-45b4-44ac-a810-19b6cee4bcd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape1786415-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:8b:1f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593018, 'reachable_time': 20099, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216159, 'error': None, 'target': 'ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.743 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[27f44221-6e9e-452d-9917-8fcc0b1cfaf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.792 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[cfd1ca09-f57f-4a87-bd1c-10cc549757c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.794 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1786415-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.794 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.795 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape1786415-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:52:14 compute-0 NetworkManager[56463]: <info>  [1771264334.7993] manager: (tape1786415-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Feb 16 17:52:14 compute-0 kernel: tape1786415-20: entered promiscuous mode
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.798 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.802 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape1786415-20, col_values=(('external_ids', {'iface-id': '48ad4b43-2a67-4dc1-98f8-91ef87bb8781'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:52:14 compute-0 ovn_controller[96437]: 2026-02-16T17:52:14Z|00220|binding|INFO|Releasing lport 48ad4b43-2a67-4dc1-98f8-91ef87bb8781 from this chassis (sb_readonly=0)
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.807 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.811 186180 DEBUG nova.compute.manager [req-bdedc01b-447e-464e-a8f4-79c7be53396d req-53d99ecd-5e5c-4ffb-a16d-19c87ca5680c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.812 186180 DEBUG oslo_concurrency.lockutils [req-bdedc01b-447e-464e-a8f4-79c7be53396d req-53d99ecd-5e5c-4ffb-a16d-19c87ca5680c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.813 186180 DEBUG oslo_concurrency.lockutils [req-bdedc01b-447e-464e-a8f4-79c7be53396d req-53d99ecd-5e5c-4ffb-a16d-19c87ca5680c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.813 186180 DEBUG oslo_concurrency.lockutils [req-bdedc01b-447e-464e-a8f4-79c7be53396d req-53d99ecd-5e5c-4ffb-a16d-19c87ca5680c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.814 186180 DEBUG nova.compute.manager [req-bdedc01b-447e-464e-a8f4-79c7be53396d req-53d99ecd-5e5c-4ffb-a16d-19c87ca5680c 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Processing event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.815 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e1786415-20fc-4c7a-ab9f-da4b30eabed4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e1786415-20fc-4c7a-ab9f-da4b30eabed4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:52:14 compute-0 nova_compute[186176]: 2026-02-16 17:52:14.815 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.816 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[51cfba87-b5f4-41d9-a924-0277c66a24e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.817 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-e1786415-20fc-4c7a-ab9f-da4b30eabed4
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/e1786415-20fc-4c7a-ab9f-da4b30eabed4.pid.haproxy
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID e1786415-20fc-4c7a-ab9f-da4b30eabed4
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:52:14 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:14.819 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4', 'env', 'PROCESS_TAG=haproxy-e1786415-20fc-4c7a-ab9f-da4b30eabed4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e1786415-20fc-4c7a-ab9f-da4b30eabed4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.055 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.134 186180 DEBUG nova.compute.manager [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.136 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264335.1331408, 0261deed-431b-4294-a4e7-e2a37fe3601b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.136 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] VM Started (Lifecycle Event)
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.141 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.147 186180 INFO nova.virt.libvirt.driver [-] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Instance spawned successfully.
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.148 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:52:15 compute-0 podman[216196]: 2026-02-16 17:52:15.179705245 +0000 UTC m=+0.064119279 container create 30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.179 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.184 186180 DEBUG nova.network.neutron [req-dfa010c2-9d35-4f90-a231-61612d1a175e req-531c75ad-3d34-4314-ac1a-d04465b69f5a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Updated VIF entry in instance network info cache for port 0653b14c-0e3a-4352-ac9c-52c00e138e2b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.184 186180 DEBUG nova.network.neutron [req-dfa010c2-9d35-4f90-a231-61612d1a175e req-531c75ad-3d34-4314-ac1a-d04465b69f5a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Updating instance_info_cache with network_info: [{"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.203 186180 DEBUG oslo_concurrency.lockutils [req-dfa010c2-9d35-4f90-a231-61612d1a175e req-531c75ad-3d34-4314-ac1a-d04465b69f5a 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.206 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:52:15 compute-0 systemd[1]: Started libpod-conmon-30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5.scope.
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.212 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.213 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.214 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.215 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.216 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.216 186180 DEBUG nova.virt.libvirt.driver [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:52:15 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:52:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7c6ca801de086cccf34a2dfe507f347a0be7b92fbcbe06791c4581193e316fd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:52:15 compute-0 podman[216196]: 2026-02-16 17:52:15.149762268 +0000 UTC m=+0.034176372 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:52:15 compute-0 podman[216196]: 2026-02-16 17:52:15.248041547 +0000 UTC m=+0.132455621 container init 30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 16 17:52:15 compute-0 podman[216196]: 2026-02-16 17:52:15.256392372 +0000 UTC m=+0.140806436 container start 30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 16 17:52:15 compute-0 neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4[216212]: [NOTICE]   (216225) : New worker (216235) forked
Feb 16 17:52:15 compute-0 neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4[216212]: [NOTICE]   (216225) : Loading success.
Feb 16 17:52:15 compute-0 podman[216208]: 2026-02-16 17:52:15.293411383 +0000 UTC m=+0.077579420 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.447 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.448 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264335.1333437, 0261deed-431b-4294-a4e7-e2a37fe3601b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.448 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] VM Paused (Lifecycle Event)
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.482 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.487 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264335.1386716, 0261deed-431b-4294-a4e7-e2a37fe3601b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.487 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] VM Resumed (Lifecycle Event)
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.496 186180 INFO nova.compute.manager [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Took 9.35 seconds to spawn the instance on the hypervisor.
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.497 186180 DEBUG nova.compute.manager [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.509 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.513 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.535 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.589 186180 INFO nova.compute.manager [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Took 9.85 seconds to build instance.
Feb 16 17:52:15 compute-0 nova_compute[186176]: 2026-02-16 17:52:15.607 186180 DEBUG oslo_concurrency.lockutils [None req-44720d5a-a876-41ee-a88b-9df94ec21d36 8873acc2ce444a0e8eb100e6fdac8df7 d546004cb905437e918387be6c24c45b - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.933s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:16 compute-0 nova_compute[186176]: 2026-02-16 17:52:16.887 186180 DEBUG nova.compute.manager [req-d8a1dfc0-1024-4c79-b8e6-b04b2abb724c req-7d0b23d6-fd60-4aee-b860-6868533b6cab 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:16 compute-0 nova_compute[186176]: 2026-02-16 17:52:16.888 186180 DEBUG oslo_concurrency.lockutils [req-d8a1dfc0-1024-4c79-b8e6-b04b2abb724c req-7d0b23d6-fd60-4aee-b860-6868533b6cab 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:16 compute-0 nova_compute[186176]: 2026-02-16 17:52:16.888 186180 DEBUG oslo_concurrency.lockutils [req-d8a1dfc0-1024-4c79-b8e6-b04b2abb724c req-7d0b23d6-fd60-4aee-b860-6868533b6cab 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:16 compute-0 nova_compute[186176]: 2026-02-16 17:52:16.889 186180 DEBUG oslo_concurrency.lockutils [req-d8a1dfc0-1024-4c79-b8e6-b04b2abb724c req-7d0b23d6-fd60-4aee-b860-6868533b6cab 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:16 compute-0 nova_compute[186176]: 2026-02-16 17:52:16.889 186180 DEBUG nova.compute.manager [req-d8a1dfc0-1024-4c79-b8e6-b04b2abb724c req-7d0b23d6-fd60-4aee-b860-6868533b6cab 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] No waiting events found dispatching network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:52:16 compute-0 nova_compute[186176]: 2026-02-16 17:52:16.890 186180 WARNING nova.compute.manager [req-d8a1dfc0-1024-4c79-b8e6-b04b2abb724c req-7d0b23d6-fd60-4aee-b860-6868533b6cab 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received unexpected event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b for instance with vm_state active and task_state None.
Feb 16 17:52:18 compute-0 nova_compute[186176]: 2026-02-16 17:52:18.466 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:20 compute-0 nova_compute[186176]: 2026-02-16 17:52:20.057 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:22 compute-0 podman[216247]: 2026-02-16 17:52:22.112543005 +0000 UTC m=+0.085477794 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:52:22 compute-0 podman[216248]: 2026-02-16 17:52:22.114969535 +0000 UTC m=+0.086705985 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:52:23 compute-0 nova_compute[186176]: 2026-02-16 17:52:23.468 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:25 compute-0 nova_compute[186176]: 2026-02-16 17:52:25.060 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:26 compute-0 ovn_controller[96437]: 2026-02-16T17:52:26Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6a:10:15 10.100.0.8
Feb 16 17:52:26 compute-0 ovn_controller[96437]: 2026-02-16T17:52:26Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:10:15 10.100.0.8
Feb 16 17:52:28 compute-0 nova_compute[186176]: 2026-02-16 17:52:28.470 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:29 compute-0 podman[195505]: time="2026-02-16T17:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:52:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:52:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2643 "" "Go-http-client/1.1"
Feb 16 17:52:30 compute-0 nova_compute[186176]: 2026-02-16 17:52:30.062 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:31 compute-0 openstack_network_exporter[198360]: ERROR   17:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:52:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:52:31 compute-0 openstack_network_exporter[198360]: ERROR   17:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:52:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:52:33 compute-0 nova_compute[186176]: 2026-02-16 17:52:33.473 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:35 compute-0 nova_compute[186176]: 2026-02-16 17:52:35.065 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:36 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 16 17:52:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:38.186 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:38.187 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:38.187 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:38 compute-0 nova_compute[186176]: 2026-02-16 17:52:38.476 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:40 compute-0 nova_compute[186176]: 2026-02-16 17:52:40.068 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:41 compute-0 nova_compute[186176]: 2026-02-16 17:52:41.349 186180 DEBUG nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Check if temp file /var/lib/nova/instances/tmpas8nzl_8 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 17:52:41 compute-0 nova_compute[186176]: 2026-02-16 17:52:41.350 186180 DEBUG nova.compute.manager [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpas8nzl_8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0261deed-431b-4294-a4e7-e2a37fe3601b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 17:52:43 compute-0 nova_compute[186176]: 2026-02-16 17:52:43.090 186180 DEBUG oslo_concurrency.processutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:52:43 compute-0 nova_compute[186176]: 2026-02-16 17:52:43.137 186180 DEBUG oslo_concurrency.processutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:52:43 compute-0 nova_compute[186176]: 2026-02-16 17:52:43.138 186180 DEBUG oslo_concurrency.processutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:52:43 compute-0 nova_compute[186176]: 2026-02-16 17:52:43.216 186180 DEBUG oslo_concurrency.processutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:52:43 compute-0 nova_compute[186176]: 2026-02-16 17:52:43.478 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:44 compute-0 podman[216320]: 2026-02-16 17:52:44.124373161 +0000 UTC m=+0.079264552 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, version=9.7, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, distribution-scope=public)
Feb 16 17:52:45 compute-0 nova_compute[186176]: 2026-02-16 17:52:45.070 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:46 compute-0 podman[216343]: 2026-02-16 17:52:46.152162712 +0000 UTC m=+0.115617306 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:52:47 compute-0 sshd-session[216365]: Accepted publickey for nova from 192.168.122.101 port 56986 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:52:47 compute-0 systemd-logind[821]: New session 43 of user nova.
Feb 16 17:52:47 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 17:52:47 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 17:52:47 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 17:52:47 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 17:52:47 compute-0 systemd[216369]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:52:47 compute-0 systemd[216369]: Queued start job for default target Main User Target.
Feb 16 17:52:47 compute-0 systemd[216369]: Created slice User Application Slice.
Feb 16 17:52:47 compute-0 systemd[216369]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:52:47 compute-0 systemd[216369]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 17:52:47 compute-0 systemd[216369]: Reached target Paths.
Feb 16 17:52:47 compute-0 systemd[216369]: Reached target Timers.
Feb 16 17:52:47 compute-0 systemd[216369]: Starting D-Bus User Message Bus Socket...
Feb 16 17:52:47 compute-0 systemd[216369]: Starting Create User's Volatile Files and Directories...
Feb 16 17:52:47 compute-0 systemd[216369]: Finished Create User's Volatile Files and Directories.
Feb 16 17:52:47 compute-0 systemd[216369]: Listening on D-Bus User Message Bus Socket.
Feb 16 17:52:47 compute-0 systemd[216369]: Reached target Sockets.
Feb 16 17:52:47 compute-0 systemd[216369]: Reached target Basic System.
Feb 16 17:52:47 compute-0 systemd[216369]: Reached target Main User Target.
Feb 16 17:52:47 compute-0 systemd[216369]: Startup finished in 156ms.
Feb 16 17:52:47 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 17:52:47 compute-0 systemd[1]: Started Session 43 of User nova.
Feb 16 17:52:47 compute-0 sshd-session[216365]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:52:47 compute-0 sshd-session[216384]: Received disconnect from 192.168.122.101 port 56986:11: disconnected by user
Feb 16 17:52:47 compute-0 sshd-session[216384]: Disconnected from user nova 192.168.122.101 port 56986
Feb 16 17:52:47 compute-0 sshd-session[216365]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:52:47 compute-0 systemd[1]: session-43.scope: Deactivated successfully.
Feb 16 17:52:47 compute-0 systemd-logind[821]: Session 43 logged out. Waiting for processes to exit.
Feb 16 17:52:47 compute-0 systemd-logind[821]: Removed session 43.
Feb 16 17:52:48 compute-0 nova_compute[186176]: 2026-02-16 17:52:48.482 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:48 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:48.651 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:52:48 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:48.653 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:52:48 compute-0 nova_compute[186176]: 2026-02-16 17:52:48.652 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:48 compute-0 nova_compute[186176]: 2026-02-16 17:52:48.698 186180 DEBUG nova.compute.manager [req-571729ec-617b-4321-afa9-387e3185adf1 req-c3b7f1a9-bfb0-4c94-a4c7-1c878a4d1737 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-unplugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:48 compute-0 nova_compute[186176]: 2026-02-16 17:52:48.699 186180 DEBUG oslo_concurrency.lockutils [req-571729ec-617b-4321-afa9-387e3185adf1 req-c3b7f1a9-bfb0-4c94-a4c7-1c878a4d1737 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:48 compute-0 nova_compute[186176]: 2026-02-16 17:52:48.699 186180 DEBUG oslo_concurrency.lockutils [req-571729ec-617b-4321-afa9-387e3185adf1 req-c3b7f1a9-bfb0-4c94-a4c7-1c878a4d1737 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:48 compute-0 nova_compute[186176]: 2026-02-16 17:52:48.699 186180 DEBUG oslo_concurrency.lockutils [req-571729ec-617b-4321-afa9-387e3185adf1 req-c3b7f1a9-bfb0-4c94-a4c7-1c878a4d1737 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:48 compute-0 nova_compute[186176]: 2026-02-16 17:52:48.700 186180 DEBUG nova.compute.manager [req-571729ec-617b-4321-afa9-387e3185adf1 req-c3b7f1a9-bfb0-4c94-a4c7-1c878a4d1737 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] No waiting events found dispatching network-vif-unplugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:52:48 compute-0 nova_compute[186176]: 2026-02-16 17:52:48.700 186180 DEBUG nova.compute.manager [req-571729ec-617b-4321-afa9-387e3185adf1 req-c3b7f1a9-bfb0-4c94-a4c7-1c878a4d1737 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-unplugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.511 186180 INFO nova.compute.manager [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Took 6.29 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.512 186180 DEBUG nova.compute.manager [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.550 186180 DEBUG nova.compute.manager [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpas8nzl_8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0261deed-431b-4294-a4e7-e2a37fe3601b',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(2b9f1ac5-85ad-4490-9405-d8cb4c687145),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.580 186180 DEBUG nova.objects.instance [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lazy-loading 'migration_context' on Instance uuid 0261deed-431b-4294-a4e7-e2a37fe3601b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.582 186180 DEBUG nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.584 186180 DEBUG nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.584 186180 DEBUG nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.612 186180 DEBUG nova.virt.libvirt.vif [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:52:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-2052325885',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-2052325885',id=27,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:52:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d546004cb905437e918387be6c24c45b',ramdisk_id='',reservation_id='r-644siyl2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-29330499',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-29330499-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:52:15Z,user_data=None,user_id='8873acc2ce444a0e8eb100e6fdac8df7',uuid=0261deed-431b-4294-a4e7-e2a37fe3601b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.613 186180 DEBUG nova.network.os_vif_util [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Converting VIF {"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.614 186180 DEBUG nova.network.os_vif_util [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:10:15,bridge_name='br-int',has_traffic_filtering=True,id=0653b14c-0e3a-4352-ac9c-52c00e138e2b,network=Network(e1786415-20fc-4c7a-ab9f-da4b30eabed4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0653b14c-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.614 186180 DEBUG nova.virt.libvirt.migration [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 17:52:49 compute-0 nova_compute[186176]:   <mac address="fa:16:3e:6a:10:15"/>
Feb 16 17:52:49 compute-0 nova_compute[186176]:   <model type="virtio"/>
Feb 16 17:52:49 compute-0 nova_compute[186176]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:52:49 compute-0 nova_compute[186176]:   <mtu size="1442"/>
Feb 16 17:52:49 compute-0 nova_compute[186176]:   <target dev="tap0653b14c-0e"/>
Feb 16 17:52:49 compute-0 nova_compute[186176]: </interface>
Feb 16 17:52:49 compute-0 nova_compute[186176]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 17:52:49 compute-0 nova_compute[186176]: 2026-02-16 17:52:49.615 186180 DEBUG nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.074 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.091 186180 DEBUG nova.virt.libvirt.migration [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.092 186180 INFO nova.virt.libvirt.migration [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.197 186180 INFO nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.701 186180 DEBUG nova.virt.libvirt.migration [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.702 186180 DEBUG nova.virt.libvirt.migration [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.784 186180 DEBUG nova.compute.manager [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.784 186180 DEBUG oslo_concurrency.lockutils [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.785 186180 DEBUG oslo_concurrency.lockutils [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.785 186180 DEBUG oslo_concurrency.lockutils [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.786 186180 DEBUG nova.compute.manager [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] No waiting events found dispatching network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.786 186180 WARNING nova.compute.manager [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received unexpected event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b for instance with vm_state active and task_state migrating.
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.786 186180 DEBUG nova.compute.manager [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-changed-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.787 186180 DEBUG nova.compute.manager [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Refreshing instance network info cache due to event network-changed-0653b14c-0e3a-4352-ac9c-52c00e138e2b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.787 186180 DEBUG oslo_concurrency.lockutils [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.787 186180 DEBUG oslo_concurrency.lockutils [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:52:50 compute-0 nova_compute[186176]: 2026-02-16 17:52:50.788 186180 DEBUG nova.network.neutron [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Refreshing network info cache for port 0653b14c-0e3a-4352-ac9c-52c00e138e2b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.126 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264371.1263814, 0261deed-431b-4294-a4e7-e2a37fe3601b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.127 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] VM Paused (Lifecycle Event)
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.206 186180 DEBUG nova.virt.libvirt.migration [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.206 186180 DEBUG nova.virt.libvirt.migration [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 16 17:52:51 compute-0 kernel: tap0653b14c-0e (unregistering): left promiscuous mode
Feb 16 17:52:51 compute-0 NetworkManager[56463]: <info>  [1771264371.3141] device (tap0653b14c-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:52:51 compute-0 ovn_controller[96437]: 2026-02-16T17:52:51Z|00221|binding|INFO|Releasing lport 0653b14c-0e3a-4352-ac9c-52c00e138e2b from this chassis (sb_readonly=0)
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.322 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:51 compute-0 ovn_controller[96437]: 2026-02-16T17:52:51Z|00222|binding|INFO|Setting lport 0653b14c-0e3a-4352-ac9c-52c00e138e2b down in Southbound
Feb 16 17:52:51 compute-0 ovn_controller[96437]: 2026-02-16T17:52:51Z|00223|binding|INFO|Removing iface tap0653b14c-0e ovn-installed in OVS
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.325 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.337 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:51 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Feb 16 17:52:51 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001b.scope: Consumed 13.428s CPU time.
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.379 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:52:51 compute-0 systemd-machined[155631]: Machine qemu-21-instance-0000001b terminated.
Feb 16 17:52:51 compute-0 virtqemud[185389]: cannot parse process status data
Feb 16 17:52:51 compute-0 kernel: tap0653b14c-0e: entered promiscuous mode
Feb 16 17:52:51 compute-0 kernel: tap0653b14c-0e (unregistering): left promiscuous mode
Feb 16 17:52:51 compute-0 NetworkManager[56463]: <info>  [1771264371.5146] manager: (tap0653b14c-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.517 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:51 compute-0 ovn_controller[96437]: 2026-02-16T17:52:51Z|00224|binding|INFO|Releasing lport 48ad4b43-2a67-4dc1-98f8-91ef87bb8781 from this chassis (sb_readonly=0)
Feb 16 17:52:51 compute-0 ovn_controller[96437]: 2026-02-16T17:52:51Z|00225|binding|INFO|Claiming lport 0653b14c-0e3a-4352-ac9c-52c00e138e2b for this chassis.
Feb 16 17:52:51 compute-0 ovn_controller[96437]: 2026-02-16T17:52:51Z|00226|binding|INFO|0653b14c-0e3a-4352-ac9c-52c00e138e2b: Claiming fa:16:3e:6a:10:15 10.100.0.8
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.530 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:10:15 10.100.0.8'], port_security=['fa:16:3e:6a:10:15 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '2e3a84a9-c1b4-4b1e-92e3-57d0875592cc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0261deed-431b-4294-a4e7-e2a37fe3601b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e1786415-20fc-4c7a-ab9f-da4b30eabed4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd546004cb905437e918387be6c24c45b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1b7d6196-f606-495d-b6e3-1ed1668bf149', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4823df8f-69fb-4d1d-9566-74558cb58ee1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=0653b14c-0e3a-4352-ac9c-52c00e138e2b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.532 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 0653b14c-0e3a-4352-ac9c-52c00e138e2b in datapath e1786415-20fc-4c7a-ab9f-da4b30eabed4 unbound from our chassis
Feb 16 17:52:51 compute-0 ovn_controller[96437]: 2026-02-16T17:52:51Z|00227|binding|INFO|Setting lport 0653b14c-0e3a-4352-ac9c-52c00e138e2b ovn-installed in OVS
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.535 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.535 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e1786415-20fc-4c7a-ab9f-da4b30eabed4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.538 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[350e881b-5534-4ee2-86c2-fdf5dbb2d66a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.539 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4 namespace which is not needed anymore
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.562 186180 DEBUG nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.562 186180 DEBUG nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.562 186180 DEBUG nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 17:52:51 compute-0 ovn_controller[96437]: 2026-02-16T17:52:51Z|00228|binding|INFO|Setting lport 0653b14c-0e3a-4352-ac9c-52c00e138e2b up in Southbound
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.604 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:10:15 10.100.0.8'], port_security=['fa:16:3e:6a:10:15 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '2e3a84a9-c1b4-4b1e-92e3-57d0875592cc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0261deed-431b-4294-a4e7-e2a37fe3601b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e1786415-20fc-4c7a-ab9f-da4b30eabed4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd546004cb905437e918387be6c24c45b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1b7d6196-f606-495d-b6e3-1ed1668bf149', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4823df8f-69fb-4d1d-9566-74558cb58ee1, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=0653b14c-0e3a-4352-ac9c-52c00e138e2b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.709 186180 DEBUG nova.virt.libvirt.guest [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '0261deed-431b-4294-a4e7-e2a37fe3601b' (instance-0000001b) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.710 186180 INFO nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Migration operation has completed
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.710 186180 INFO nova.compute.manager [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] _post_live_migration() is started..
Feb 16 17:52:51 compute-0 neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4[216212]: [NOTICE]   (216225) : haproxy version is 2.8.14-c23fe91
Feb 16 17:52:51 compute-0 neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4[216212]: [NOTICE]   (216225) : path to executable is /usr/sbin/haproxy
Feb 16 17:52:51 compute-0 neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4[216212]: [WARNING]  (216225) : Exiting Master process...
Feb 16 17:52:51 compute-0 neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4[216212]: [ALERT]    (216225) : Current worker (216235) exited with code 143 (Terminated)
Feb 16 17:52:51 compute-0 neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4[216212]: [WARNING]  (216225) : All workers exited. Exiting... (0)
Feb 16 17:52:51 compute-0 systemd[1]: libpod-30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5.scope: Deactivated successfully.
Feb 16 17:52:51 compute-0 podman[216434]: 2026-02-16 17:52:51.731714659 +0000 UTC m=+0.060328086 container died 30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:52:51 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5-userdata-shm.mount: Deactivated successfully.
Feb 16 17:52:51 compute-0 systemd[1]: var-lib-containers-storage-overlay-e7c6ca801de086cccf34a2dfe507f347a0be7b92fbcbe06791c4581193e316fd-merged.mount: Deactivated successfully.
Feb 16 17:52:51 compute-0 podman[216434]: 2026-02-16 17:52:51.782190911 +0000 UTC m=+0.110804348 container cleanup 30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 17:52:51 compute-0 systemd[1]: libpod-conmon-30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5.scope: Deactivated successfully.
Feb 16 17:52:51 compute-0 podman[216464]: 2026-02-16 17:52:51.86548375 +0000 UTC m=+0.057626759 container remove 30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.871 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[570066a1-f8c4-4a43-a97d-c0d03ccc1812]: (4, ('Mon Feb 16 05:52:51 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4 (30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5)\n30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5\nMon Feb 16 05:52:51 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4 (30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5)\n30796a043eb4ce53a78482cc74a47fd6b5b4215b08720fb535bcfb2ff73f38d5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.873 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[bfdec91b-c676-4389-b5c4-c1514f6f752f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.875 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1786415-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.878 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:51 compute-0 kernel: tape1786415-20: left promiscuous mode
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.892 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:51 compute-0 ovn_controller[96437]: 2026-02-16T17:52:51Z|00229|binding|INFO|Releasing lport 0653b14c-0e3a-4352-ac9c-52c00e138e2b from this chassis (sb_readonly=0)
Feb 16 17:52:51 compute-0 ovn_controller[96437]: 2026-02-16T17:52:51Z|00230|binding|INFO|Setting lport 0653b14c-0e3a-4352-ac9c-52c00e138e2b down in Southbound
Feb 16 17:52:51 compute-0 ovn_controller[96437]: 2026-02-16T17:52:51Z|00231|binding|INFO|Removing iface tap0653b14c-0e ovn-installed in OVS
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.897 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.898 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e6190cf1-3c8d-4a45-b7a6-a284a122e107]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.908 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:10:15 10.100.0.8'], port_security=['fa:16:3e:6a:10:15 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '2e3a84a9-c1b4-4b1e-92e3-57d0875592cc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '0261deed-431b-4294-a4e7-e2a37fe3601b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e1786415-20fc-4c7a-ab9f-da4b30eabed4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd546004cb905437e918387be6c24c45b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '1b7d6196-f606-495d-b6e3-1ed1668bf149', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4823df8f-69fb-4d1d-9566-74558cb58ee1, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=0653b14c-0e3a-4352-ac9c-52c00e138e2b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:52:51 compute-0 nova_compute[186176]: 2026-02-16 17:52:51.908 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.913 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[ed97e02f-edf8-4436-a890-78e053166c59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.916 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[6813317a-208c-48bb-8843-4f7de631e46a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.936 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[5c6096f6-96bc-4bd2-8fce-2c2e46ce1fa5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 593010, 'reachable_time': 23904, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216482, 'error': None, 'target': 'ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.939 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e1786415-20fc-4c7a-ab9f-da4b30eabed4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.939 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba14440-af17-4bb0-9ddf-1604a643810e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:51 compute-0 systemd[1]: run-netns-ovnmeta\x2de1786415\x2d20fc\x2d4c7a\x2dab9f\x2dda4b30eabed4.mount: Deactivated successfully.
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.940 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 0653b14c-0e3a-4352-ac9c-52c00e138e2b in datapath e1786415-20fc-4c7a-ab9f-da4b30eabed4 unbound from our chassis
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.942 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e1786415-20fc-4c7a-ab9f-da4b30eabed4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.943 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[6d46cf4d-7209-466a-b98f-e47fd8794994]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.944 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 0653b14c-0e3a-4352-ac9c-52c00e138e2b in datapath e1786415-20fc-4c7a-ab9f-da4b30eabed4 unbound from our chassis
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.946 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e1786415-20fc-4c7a-ab9f-da4b30eabed4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:52:51 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:51.947 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[c0430635-98c4-4988-915e-d74f30b8fb20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.390 186180 DEBUG nova.network.neutron [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Updated VIF entry in instance network info cache for port 0653b14c-0e3a-4352-ac9c-52c00e138e2b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.391 186180 DEBUG nova.network.neutron [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Updating instance_info_cache with network_info: [{"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.413 186180 DEBUG oslo_concurrency.lockutils [req-296c5987-d17e-4a68-8d13-02b4bf01f953 req-55fbd5e7-8e08-48b8-a43d-c4c953b4eb6e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.517 186180 DEBUG nova.network.neutron [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Activated binding for port 0653b14c-0e3a-4352-ac9c-52c00e138e2b and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.518 186180 DEBUG nova.compute.manager [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.519 186180 DEBUG nova.virt.libvirt.vif [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:52:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-2052325885',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-2052325885',id=27,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:52:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d546004cb905437e918387be6c24c45b',ramdisk_id='',reservation_id='r-644siyl2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-29330499',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-29330499-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:52:38Z,user_data=None,user_id='8873acc2ce444a0e8eb100e6fdac8df7',uuid=0261deed-431b-4294-a4e7-e2a37fe3601b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.520 186180 DEBUG nova.network.os_vif_util [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Converting VIF {"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.521 186180 DEBUG nova.network.os_vif_util [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:10:15,bridge_name='br-int',has_traffic_filtering=True,id=0653b14c-0e3a-4352-ac9c-52c00e138e2b,network=Network(e1786415-20fc-4c7a-ab9f-da4b30eabed4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0653b14c-0e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.522 186180 DEBUG os_vif [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:10:15,bridge_name='br-int',has_traffic_filtering=True,id=0653b14c-0e3a-4352-ac9c-52c00e138e2b,network=Network(e1786415-20fc-4c7a-ab9f-da4b30eabed4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0653b14c-0e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.524 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.525 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0653b14c-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.527 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.529 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.534 186180 INFO os_vif [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:10:15,bridge_name='br-int',has_traffic_filtering=True,id=0653b14c-0e3a-4352-ac9c-52c00e138e2b,network=Network(e1786415-20fc-4c7a-ab9f-da4b30eabed4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0653b14c-0e')
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.535 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.535 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.536 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.536 186180 DEBUG nova.compute.manager [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.537 186180 INFO nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Deleting instance files /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b_del
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.538 186180 INFO nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Deletion of /var/lib/nova/instances/0261deed-431b-4294-a4e7-e2a37fe3601b_del complete
Feb 16 17:52:52 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:52:52.656 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.870 186180 DEBUG nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-unplugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.870 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.871 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.871 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.871 186180 DEBUG nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] No waiting events found dispatching network-vif-unplugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.872 186180 DEBUG nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-unplugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.872 186180 DEBUG nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.872 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.872 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.873 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.873 186180 DEBUG nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] No waiting events found dispatching network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.873 186180 WARNING nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received unexpected event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b for instance with vm_state active and task_state migrating.
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.873 186180 DEBUG nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.874 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.874 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.874 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.874 186180 DEBUG nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] No waiting events found dispatching network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.875 186180 WARNING nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received unexpected event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b for instance with vm_state active and task_state migrating.
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.875 186180 DEBUG nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-unplugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.875 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.875 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.876 186180 DEBUG oslo_concurrency.lockutils [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.876 186180 DEBUG nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] No waiting events found dispatching network-vif-unplugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:52:52 compute-0 nova_compute[186176]: 2026-02-16 17:52:52.876 186180 DEBUG nova.compute.manager [req-ff92dd60-7263-441b-b81b-786a03990686 req-eb51b8a7-e757-4745-b6e6-56f4e085ac0f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-unplugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:52:53 compute-0 podman[216484]: 2026-02-16 17:52:53.133358839 +0000 UTC m=+0.086323640 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:52:53 compute-0 podman[216483]: 2026-02-16 17:52:53.175120684 +0000 UTC m=+0.130545845 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.981 186180 DEBUG nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.982 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.982 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.982 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.983 186180 DEBUG nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] No waiting events found dispatching network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.983 186180 WARNING nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received unexpected event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b for instance with vm_state active and task_state migrating.
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.984 186180 DEBUG nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.984 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.985 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.986 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.986 186180 DEBUG nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] No waiting events found dispatching network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.987 186180 WARNING nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received unexpected event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b for instance with vm_state active and task_state migrating.
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.987 186180 DEBUG nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.988 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.988 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.988 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.989 186180 DEBUG nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] No waiting events found dispatching network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.989 186180 WARNING nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received unexpected event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b for instance with vm_state active and task_state migrating.
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.990 186180 DEBUG nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.990 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.991 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.991 186180 DEBUG oslo_concurrency.lockutils [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.991 186180 DEBUG nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] No waiting events found dispatching network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:52:54 compute-0 nova_compute[186176]: 2026-02-16 17:52:54.992 186180 WARNING nova.compute.manager [req-e98716ff-022e-4853-821f-671ebf0698cf req-72d8786e-bbfd-41f9-a524-73cf6325be1f 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Received unexpected event network-vif-plugged-0653b14c-0e3a-4352-ac9c-52c00e138e2b for instance with vm_state active and task_state migrating.
Feb 16 17:52:55 compute-0 nova_compute[186176]: 2026-02-16 17:52:55.075 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:57 compute-0 nova_compute[186176]: 2026-02-16 17:52:57.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:57 compute-0 nova_compute[186176]: 2026-02-16 17:52:57.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:52:57 compute-0 nova_compute[186176]: 2026-02-16 17:52:57.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:52:57 compute-0 nova_compute[186176]: 2026-02-16 17:52:57.419 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:52:57 compute-0 nova_compute[186176]: 2026-02-16 17:52:57.419 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquired lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:52:57 compute-0 nova_compute[186176]: 2026-02-16 17:52:57.419 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 16 17:52:57 compute-0 nova_compute[186176]: 2026-02-16 17:52:57.420 186180 DEBUG nova.objects.instance [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0261deed-431b-4294-a4e7-e2a37fe3601b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:52:57 compute-0 nova_compute[186176]: 2026-02-16 17:52:57.527 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:52:58 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 17:52:58 compute-0 systemd[216369]: Activating special unit Exit the Session...
Feb 16 17:52:58 compute-0 systemd[216369]: Stopped target Main User Target.
Feb 16 17:52:58 compute-0 systemd[216369]: Stopped target Basic System.
Feb 16 17:52:58 compute-0 systemd[216369]: Stopped target Paths.
Feb 16 17:52:58 compute-0 systemd[216369]: Stopped target Sockets.
Feb 16 17:52:58 compute-0 systemd[216369]: Stopped target Timers.
Feb 16 17:52:58 compute-0 systemd[216369]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:52:58 compute-0 systemd[216369]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 17:52:58 compute-0 systemd[216369]: Closed D-Bus User Message Bus Socket.
Feb 16 17:52:58 compute-0 systemd[216369]: Stopped Create User's Volatile Files and Directories.
Feb 16 17:52:58 compute-0 systemd[216369]: Removed slice User Application Slice.
Feb 16 17:52:58 compute-0 systemd[216369]: Reached target Shutdown.
Feb 16 17:52:58 compute-0 systemd[216369]: Finished Exit the Session.
Feb 16 17:52:58 compute-0 systemd[216369]: Reached target Exit the Session.
Feb 16 17:52:58 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 17:52:58 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 17:52:58 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 17:52:58 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 17:52:58 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 17:52:58 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 17:52:58 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 17:52:59 compute-0 nova_compute[186176]: 2026-02-16 17:52:59.598 186180 DEBUG nova.network.neutron [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Updating instance_info_cache with network_info: [{"id": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "address": "fa:16:3e:6a:10:15", "network": {"id": "e1786415-20fc-4c7a-ab9f-da4b30eabed4", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-1231223498-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d546004cb905437e918387be6c24c45b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0653b14c-0e", "ovs_interfaceid": "0653b14c-0e3a-4352-ac9c-52c00e138e2b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:52:59 compute-0 nova_compute[186176]: 2026-02-16 17:52:59.632 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Releasing lock "refresh_cache-0261deed-431b-4294-a4e7-e2a37fe3601b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:52:59 compute-0 nova_compute[186176]: 2026-02-16 17:52:59.633 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 16 17:52:59 compute-0 nova_compute[186176]: 2026-02-16 17:52:59.634 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:59 compute-0 nova_compute[186176]: 2026-02-16 17:52:59.634 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:59 compute-0 nova_compute[186176]: 2026-02-16 17:52:59.634 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:52:59 compute-0 nova_compute[186176]: 2026-02-16 17:52:59.635 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:52:59 compute-0 podman[195505]: time="2026-02-16T17:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:52:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:52:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.078 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.426 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Acquiring lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.426 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.427 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "0261deed-431b-4294-a4e7-e2a37fe3601b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.459 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.459 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.460 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.460 186180 DEBUG nova.compute.resource_tracker [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.687 186180 WARNING nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.689 186180 DEBUG nova.compute.resource_tracker [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5768MB free_disk=73.22279357910156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.689 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.690 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.727 186180 DEBUG nova.compute.resource_tracker [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Migration for instance 0261deed-431b-4294-a4e7-e2a37fe3601b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.746 186180 DEBUG nova.compute.resource_tracker [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.843 186180 DEBUG nova.compute.resource_tracker [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Migration 2b9f1ac5-85ad-4490-9405-d8cb4c687145 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.843 186180 DEBUG nova.compute.resource_tracker [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.844 186180 DEBUG nova.compute.resource_tracker [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.912 186180 DEBUG nova.compute.provider_tree [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.930 186180 DEBUG nova.scheduler.client.report [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.953 186180 DEBUG nova.compute.resource_tracker [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.953 186180 DEBUG oslo_concurrency.lockutils [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:53:00 compute-0 nova_compute[186176]: 2026-02-16 17:53:00.961 186180 INFO nova.compute.manager [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 16 17:53:01 compute-0 nova_compute[186176]: 2026-02-16 17:53:01.037 186180 INFO nova.scheduler.client.report [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Deleted allocation for migration 2b9f1ac5-85ad-4490-9405-d8cb4c687145
Feb 16 17:53:01 compute-0 nova_compute[186176]: 2026-02-16 17:53:01.038 186180 DEBUG nova.virt.libvirt.driver [None req-81bbba29-ee12-4a0b-80ac-3925885a48b5 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 17:53:01 compute-0 openstack_network_exporter[198360]: ERROR   17:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:53:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:53:01 compute-0 openstack_network_exporter[198360]: ERROR   17:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:53:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:53:01 compute-0 nova_compute[186176]: 2026-02-16 17:53:01.629 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.337 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.338 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.339 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.339 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.529 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.545 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.546 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5771MB free_disk=73.22279357910156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.546 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.547 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.594 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.594 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.612 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.624 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.627 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:53:02 compute-0 nova_compute[186176]: 2026-02-16 17:53:02.627 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:53:04 compute-0 nova_compute[186176]: 2026-02-16 17:53:04.624 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:53:05 compute-0 nova_compute[186176]: 2026-02-16 17:53:05.080 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:06 compute-0 nova_compute[186176]: 2026-02-16 17:53:06.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:53:06 compute-0 nova_compute[186176]: 2026-02-16 17:53:06.561 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771264371.559491, 0261deed-431b-4294-a4e7-e2a37fe3601b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:53:06 compute-0 nova_compute[186176]: 2026-02-16 17:53:06.561 186180 INFO nova.compute.manager [-] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] VM Stopped (Lifecycle Event)
Feb 16 17:53:06 compute-0 nova_compute[186176]: 2026-02-16 17:53:06.579 186180 DEBUG nova.compute.manager [None req-2176724e-0608-4669-9ac0-54dfdee5e148 - - - - - -] [instance: 0261deed-431b-4294-a4e7-e2a37fe3601b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:53:07 compute-0 nova_compute[186176]: 2026-02-16 17:53:07.531 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:09 compute-0 nova_compute[186176]: 2026-02-16 17:53:09.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:53:10 compute-0 nova_compute[186176]: 2026-02-16 17:53:10.083 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:11 compute-0 nova_compute[186176]: 2026-02-16 17:53:11.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:53:12 compute-0 nova_compute[186176]: 2026-02-16 17:53:12.534 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:15 compute-0 nova_compute[186176]: 2026-02-16 17:53:15.087 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:15 compute-0 podman[216536]: 2026-02-16 17:53:15.098199496 +0000 UTC m=+0.072040519 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, release=1770267347, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 16 17:53:17 compute-0 podman[216559]: 2026-02-16 17:53:17.083160731 +0000 UTC m=+0.052958041 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 16 17:53:17 compute-0 nova_compute[186176]: 2026-02-16 17:53:17.536 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:20 compute-0 nova_compute[186176]: 2026-02-16 17:53:20.092 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:21 compute-0 ovn_controller[96437]: 2026-02-16T17:53:21Z|00232|memory_trim|INFO|Detected inactivity (last active 30021 ms ago): trimming memory
Feb 16 17:53:22 compute-0 nova_compute[186176]: 2026-02-16 17:53:22.539 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:24 compute-0 podman[216579]: 2026-02-16 17:53:24.102997974 +0000 UTC m=+0.066650037 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:53:24 compute-0 podman[216578]: 2026-02-16 17:53:24.137308396 +0000 UTC m=+0.104969468 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:53:25 compute-0 nova_compute[186176]: 2026-02-16 17:53:25.098 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:27 compute-0 nova_compute[186176]: 2026-02-16 17:53:27.541 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:29 compute-0 podman[195505]: time="2026-02-16T17:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:53:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:53:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Feb 16 17:53:30 compute-0 nova_compute[186176]: 2026-02-16 17:53:30.098 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:31 compute-0 openstack_network_exporter[198360]: ERROR   17:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:53:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:53:31 compute-0 openstack_network_exporter[198360]: ERROR   17:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:53:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:53:32 compute-0 nova_compute[186176]: 2026-02-16 17:53:32.544 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:35 compute-0 nova_compute[186176]: 2026-02-16 17:53:35.100 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:37 compute-0 nova_compute[186176]: 2026-02-16 17:53:37.547 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:53:38.188 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:53:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:53:38.188 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:53:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:53:38.189 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:53:40 compute-0 nova_compute[186176]: 2026-02-16 17:53:40.103 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:42 compute-0 nova_compute[186176]: 2026-02-16 17:53:42.550 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:45 compute-0 nova_compute[186176]: 2026-02-16 17:53:45.105 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:46 compute-0 podman[216626]: 2026-02-16 17:53:46.08613721 +0000 UTC m=+0.051926766 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, vcs-type=git, version=9.7, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 16 17:53:47 compute-0 nova_compute[186176]: 2026-02-16 17:53:47.552 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:48 compute-0 podman[216647]: 2026-02-16 17:53:48.107739354 +0000 UTC m=+0.076636432 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 16 17:53:50 compute-0 nova_compute[186176]: 2026-02-16 17:53:50.108 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:52 compute-0 nova_compute[186176]: 2026-02-16 17:53:52.554 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:54 compute-0 ovn_controller[96437]: 2026-02-16T17:53:54Z|00233|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Feb 16 17:53:55 compute-0 podman[216667]: 2026-02-16 17:53:55.093724806 +0000 UTC m=+0.062257549 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:53:55 compute-0 nova_compute[186176]: 2026-02-16 17:53:55.109 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:55 compute-0 podman[216666]: 2026-02-16 17:53:55.12240194 +0000 UTC m=+0.093674790 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:53:57 compute-0 nova_compute[186176]: 2026-02-16 17:53:57.556 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:53:58 compute-0 nova_compute[186176]: 2026-02-16 17:53:58.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:53:58 compute-0 nova_compute[186176]: 2026-02-16 17:53:58.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:53:58 compute-0 nova_compute[186176]: 2026-02-16 17:53:58.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:53:58 compute-0 nova_compute[186176]: 2026-02-16 17:53:58.354 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:53:59 compute-0 nova_compute[186176]: 2026-02-16 17:53:59.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:53:59 compute-0 nova_compute[186176]: 2026-02-16 17:53:59.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:53:59 compute-0 nova_compute[186176]: 2026-02-16 17:53:59.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:53:59 compute-0 podman[195505]: time="2026-02-16T17:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:53:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:53:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Feb 16 17:54:00 compute-0 nova_compute[186176]: 2026-02-16 17:54:00.110 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:00 compute-0 nova_compute[186176]: 2026-02-16 17:54:00.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:54:01 compute-0 openstack_network_exporter[198360]: ERROR   17:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:54:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:54:01 compute-0 openstack_network_exporter[198360]: ERROR   17:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:54:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:54:01 compute-0 anacron[39749]: Job `cron.monthly' started
Feb 16 17:54:01 compute-0 anacron[39749]: Job `cron.monthly' terminated
Feb 16 17:54:01 compute-0 anacron[39749]: Normal exit (3 jobs run)
Feb 16 17:54:02 compute-0 nova_compute[186176]: 2026-02-16 17:54:02.562 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.341 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.342 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.342 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.343 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.554 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.556 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5796MB free_disk=73.2227897644043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.556 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.556 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.615 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.616 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.644 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.662 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.664 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:54:04 compute-0 nova_compute[186176]: 2026-02-16 17:54:04.665 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:54:05 compute-0 nova_compute[186176]: 2026-02-16 17:54:05.115 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:06 compute-0 nova_compute[186176]: 2026-02-16 17:54:06.660 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:54:06 compute-0 nova_compute[186176]: 2026-02-16 17:54:06.661 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:54:07 compute-0 nova_compute[186176]: 2026-02-16 17:54:07.564 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:09 compute-0 nova_compute[186176]: 2026-02-16 17:54:09.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:54:10 compute-0 nova_compute[186176]: 2026-02-16 17:54:10.116 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:11 compute-0 nova_compute[186176]: 2026-02-16 17:54:11.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:54:12 compute-0 nova_compute[186176]: 2026-02-16 17:54:12.571 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:15 compute-0 nova_compute[186176]: 2026-02-16 17:54:15.120 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:15 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:54:15.255 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:54:15 compute-0 nova_compute[186176]: 2026-02-16 17:54:15.256 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:15 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:54:15.257 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:54:17 compute-0 podman[216718]: 2026-02-16 17:54:17.113742176 +0000 UTC m=+0.079014170 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, version=9.7, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 17:54:17 compute-0 nova_compute[186176]: 2026-02-16 17:54:17.574 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:18 compute-0 nova_compute[186176]: 2026-02-16 17:54:18.599 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:19 compute-0 podman[216740]: 2026-02-16 17:54:19.106008202 +0000 UTC m=+0.073863775 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 16 17:54:19 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:54:19.260 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:54:20 compute-0 nova_compute[186176]: 2026-02-16 17:54:20.121 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:22 compute-0 nova_compute[186176]: 2026-02-16 17:54:22.576 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:25 compute-0 nova_compute[186176]: 2026-02-16 17:54:25.124 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:26 compute-0 podman[216762]: 2026-02-16 17:54:26.129017552 +0000 UTC m=+0.084817533 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 17:54:26 compute-0 podman[216761]: 2026-02-16 17:54:26.166371848 +0000 UTC m=+0.120948979 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 16 17:54:26 compute-0 sshd-session[216759]: Invalid user sol from 2.57.122.210 port 33016
Feb 16 17:54:26 compute-0 sshd-session[216759]: Connection closed by invalid user sol 2.57.122.210 port 33016 [preauth]
Feb 16 17:54:27 compute-0 nova_compute[186176]: 2026-02-16 17:54:27.578 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:29 compute-0 podman[195505]: time="2026-02-16T17:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:54:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:54:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 17:54:30 compute-0 nova_compute[186176]: 2026-02-16 17:54:30.128 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:31 compute-0 openstack_network_exporter[198360]: ERROR   17:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:54:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:54:31 compute-0 openstack_network_exporter[198360]: ERROR   17:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:54:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:54:32 compute-0 nova_compute[186176]: 2026-02-16 17:54:32.583 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:35 compute-0 nova_compute[186176]: 2026-02-16 17:54:35.132 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:37 compute-0 nova_compute[186176]: 2026-02-16 17:54:37.589 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:54:38.189 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:54:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:54:38.190 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:54:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:54:38.190 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:54:40 compute-0 nova_compute[186176]: 2026-02-16 17:54:40.134 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:42 compute-0 nova_compute[186176]: 2026-02-16 17:54:42.592 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:45 compute-0 nova_compute[186176]: 2026-02-16 17:54:45.137 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:47 compute-0 nova_compute[186176]: 2026-02-16 17:54:47.594 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:48 compute-0 podman[216811]: 2026-02-16 17:54:48.13424073 +0000 UTC m=+0.101842501 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, vcs-type=git, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=)
Feb 16 17:54:50 compute-0 podman[216833]: 2026-02-16 17:54:50.123241294 +0000 UTC m=+0.095593397 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:54:50 compute-0 nova_compute[186176]: 2026-02-16 17:54:50.139 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:52 compute-0 nova_compute[186176]: 2026-02-16 17:54:52.596 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:54 compute-0 ovn_controller[96437]: 2026-02-16T17:54:54Z|00234|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Feb 16 17:54:55 compute-0 nova_compute[186176]: 2026-02-16 17:54:55.142 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:56 compute-0 nova_compute[186176]: 2026-02-16 17:54:56.659 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:54:56 compute-0 nova_compute[186176]: 2026-02-16 17:54:56.660 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:54:56 compute-0 nova_compute[186176]: 2026-02-16 17:54:56.675 186180 DEBUG nova.compute.manager [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 16 17:54:56 compute-0 nova_compute[186176]: 2026-02-16 17:54:56.740 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:54:56 compute-0 nova_compute[186176]: 2026-02-16 17:54:56.740 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:54:56 compute-0 nova_compute[186176]: 2026-02-16 17:54:56.749 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 16 17:54:56 compute-0 nova_compute[186176]: 2026-02-16 17:54:56.749 186180 INFO nova.compute.claims [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Claim successful on node compute-0.ctlplane.example.com
Feb 16 17:54:56 compute-0 nova_compute[186176]: 2026-02-16 17:54:56.897 186180 DEBUG nova.compute.provider_tree [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:54:56 compute-0 nova_compute[186176]: 2026-02-16 17:54:56.911 186180 DEBUG nova.scheduler.client.report [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:54:56 compute-0 nova_compute[186176]: 2026-02-16 17:54:56.936 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:54:56 compute-0 nova_compute[186176]: 2026-02-16 17:54:56.937 186180 DEBUG nova.compute.manager [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.012 186180 DEBUG nova.compute.manager [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.012 186180 DEBUG nova.network.neutron [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.034 186180 INFO nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.058 186180 DEBUG nova.compute.manager [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 16 17:54:57 compute-0 podman[216853]: 2026-02-16 17:54:57.109377951 +0000 UTC m=+0.062823433 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:54:57 compute-0 podman[216852]: 2026-02-16 17:54:57.154778615 +0000 UTC m=+0.108666888 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.158 186180 DEBUG nova.compute.manager [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.159 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.159 186180 INFO nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Creating image(s)
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.160 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Acquiring lock "/var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.160 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "/var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.161 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "/var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.171 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.249 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.250 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Acquiring lock "34459df773b91356960ca90fb27335ee0115c646" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.250 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.260 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.316 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.317 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.339 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646,backing_fmt=raw /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk 1073741824" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.341 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "34459df773b91356960ca90fb27335ee0115c646" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.341 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.355 186180 DEBUG nova.policy [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '89c2f3c0467f4db98c90c15c7889ff05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd59fed88cd5a49aa81830c7075e99210', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.383 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646 --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.384 186180 DEBUG nova.virt.disk.api [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Checking if we can resize image /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.384 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.446 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.447 186180 DEBUG nova.virt.disk.api [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Cannot resize image /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.448 186180 DEBUG nova.objects.instance [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lazy-loading 'migration_context' on Instance uuid 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.475 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.476 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Ensure instance console log exists: /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.476 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.476 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.476 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:54:57 compute-0 nova_compute[186176]: 2026-02-16 17:54:57.598 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:58 compute-0 nova_compute[186176]: 2026-02-16 17:54:58.383 186180 DEBUG nova.network.neutron [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Successfully created port: 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.042 186180 DEBUG nova.network.neutron [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Successfully updated port: 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.062 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Acquiring lock "refresh_cache-5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.062 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Acquired lock "refresh_cache-5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.063 186180 DEBUG nova.network.neutron [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.151 186180 DEBUG nova.compute.manager [req-fcd40b29-4bd5-4652-8286-3c7fd54bfb87 req-0d3e290f-69c2-47ac-9bc6-5be9c5013526 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-changed-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.151 186180 DEBUG nova.compute.manager [req-fcd40b29-4bd5-4652-8286-3c7fd54bfb87 req-0d3e290f-69c2-47ac-9bc6-5be9c5013526 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Refreshing instance network info cache due to event network-changed-0b70b90a-4ed2-4fcb-af28-752acfd1efd5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.151 186180 DEBUG oslo_concurrency.lockutils [req-fcd40b29-4bd5-4652-8286-3c7fd54bfb87 req-0d3e290f-69c2-47ac-9bc6-5be9c5013526 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.239 186180 DEBUG nova.network.neutron [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:54:59 compute-0 podman[195505]: time="2026-02-16T17:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:54:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:54:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.917 186180 DEBUG nova.network.neutron [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Updating instance_info_cache with network_info: [{"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.937 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Releasing lock "refresh_cache-5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.937 186180 DEBUG nova.compute.manager [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Instance network_info: |[{"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.937 186180 DEBUG oslo_concurrency.lockutils [req-fcd40b29-4bd5-4652-8286-3c7fd54bfb87 req-0d3e290f-69c2-47ac-9bc6-5be9c5013526 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.938 186180 DEBUG nova.network.neutron [req-fcd40b29-4bd5-4652-8286-3c7fd54bfb87 req-0d3e290f-69c2-47ac-9bc6-5be9c5013526 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Refreshing network info cache for port 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.940 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Start _get_guest_xml network_info=[{"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'disk_bus': 'virtio', 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'encryption_secret_uuid': None, 'image_id': '7a81518d-a287-4a96-937c-188ae866c5b8'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.945 186180 WARNING nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.950 186180 DEBUG nova.virt.libvirt.host [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.951 186180 DEBUG nova.virt.libvirt.host [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.958 186180 DEBUG nova.virt.libvirt.host [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.959 186180 DEBUG nova.virt.libvirt.host [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.960 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.960 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-16T17:20:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='75ce9d90-876f-4652-a61c-f74d306b6692',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-16T17:20:51Z,direct_url=<?>,disk_format='qcow2',id=7a81518d-a287-4a96-937c-188ae866c5b8,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='1153d82e3c954635916cdffc75cdb267',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-16T17:20:54Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.960 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.961 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.961 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.961 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.961 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.961 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.962 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.962 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.962 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.962 186180 DEBUG nova.virt.hardware [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.965 186180 DEBUG nova.virt.libvirt.vif [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:54:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-2068269080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-2068269080',id=29,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d59fed88cd5a49aa81830c7075e99210',ramdisk_id='',reservation_id='r-zjmfmfzl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1931654466',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1
931654466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:54:57Z,user_data=None,user_id='89c2f3c0467f4db98c90c15c7889ff05',uuid=5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.966 186180 DEBUG nova.network.os_vif_util [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Converting VIF {"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.966 186180 DEBUG nova.network.os_vif_util [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:e9:f8,bridge_name='br-int',has_traffic_filtering=True,id=0b70b90a-4ed2-4fcb-af28-752acfd1efd5,network=Network(00d22388-5e2e-4a6a-9b67-767982d87ea6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b70b90a-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.967 186180 DEBUG nova.objects.instance [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.981 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] End _get_guest_xml xml=<domain type="kvm">
Feb 16 17:54:59 compute-0 nova_compute[186176]:   <uuid>5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34</uuid>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   <name>instance-0000001d</name>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   <memory>131072</memory>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   <vcpu>1</vcpu>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   <metadata>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-2068269080</nova:name>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <nova:creationTime>2026-02-16 17:54:59</nova:creationTime>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <nova:flavor name="m1.nano">
Feb 16 17:54:59 compute-0 nova_compute[186176]:         <nova:memory>128</nova:memory>
Feb 16 17:54:59 compute-0 nova_compute[186176]:         <nova:disk>1</nova:disk>
Feb 16 17:54:59 compute-0 nova_compute[186176]:         <nova:swap>0</nova:swap>
Feb 16 17:54:59 compute-0 nova_compute[186176]:         <nova:ephemeral>0</nova:ephemeral>
Feb 16 17:54:59 compute-0 nova_compute[186176]:         <nova:vcpus>1</nova:vcpus>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       </nova:flavor>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <nova:owner>
Feb 16 17:54:59 compute-0 nova_compute[186176]:         <nova:user uuid="89c2f3c0467f4db98c90c15c7889ff05">tempest-TestExecuteZoneMigrationStrategy-1931654466-project-member</nova:user>
Feb 16 17:54:59 compute-0 nova_compute[186176]:         <nova:project uuid="d59fed88cd5a49aa81830c7075e99210">tempest-TestExecuteZoneMigrationStrategy-1931654466</nova:project>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       </nova:owner>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <nova:root type="image" uuid="7a81518d-a287-4a96-937c-188ae866c5b8"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <nova:ports>
Feb 16 17:54:59 compute-0 nova_compute[186176]:         <nova:port uuid="0b70b90a-4ed2-4fcb-af28-752acfd1efd5">
Feb 16 17:54:59 compute-0 nova_compute[186176]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:         </nova:port>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       </nova:ports>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     </nova:instance>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   </metadata>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   <sysinfo type="smbios">
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <system>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <entry name="manufacturer">RDO</entry>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <entry name="product">OpenStack Compute</entry>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <entry name="serial">5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34</entry>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <entry name="uuid">5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34</entry>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <entry name="family">Virtual Machine</entry>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     </system>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   </sysinfo>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   <os>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <boot dev="hd"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <smbios mode="sysinfo"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   </os>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   <features>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <acpi/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <apic/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <vmcoreinfo/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   </features>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   <clock offset="utc">
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <timer name="pit" tickpolicy="delay"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <timer name="hpet" present="no"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   </clock>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   <cpu mode="custom" match="exact">
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <model>Nehalem</model>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <topology sockets="1" cores="1" threads="1"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   </cpu>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   <devices>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <disk type="file" device="disk">
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <target dev="vda" bus="virtio"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <disk type="file" device="cdrom">
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <driver name="qemu" type="raw" cache="none"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <source file="/var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk.config"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <target dev="sda" bus="sata"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     </disk>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <interface type="ethernet">
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <mac address="fa:16:3e:38:e9:f8"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <mtu size="1442"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <target dev="tap0b70b90a-4e"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     </interface>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <serial type="pty">
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <log file="/var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/console.log" append="off"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     </serial>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <video>
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <model type="virtio"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     </video>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <input type="tablet" bus="usb"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <rng model="virtio">
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <backend model="random">/dev/urandom</backend>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     </rng>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="pci" model="pcie-root-port"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <controller type="usb" index="0"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     <memballoon model="virtio">
Feb 16 17:54:59 compute-0 nova_compute[186176]:       <stats period="10"/>
Feb 16 17:54:59 compute-0 nova_compute[186176]:     </memballoon>
Feb 16 17:54:59 compute-0 nova_compute[186176]:   </devices>
Feb 16 17:54:59 compute-0 nova_compute[186176]: </domain>
Feb 16 17:54:59 compute-0 nova_compute[186176]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.982 186180 DEBUG nova.compute.manager [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Preparing to wait for external event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.982 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.982 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.982 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.983 186180 DEBUG nova.virt.libvirt.vif [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-16T17:54:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-2068269080',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-2068269080',id=29,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d59fed88cd5a49aa81830c7075e99210',ramdisk_id='',reservation_id='r-zjmfmfzl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1931654466',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1931654466-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-16T17:54:57Z,user_data=None,user_id='89c2f3c0467f4db98c90c15c7889ff05',uuid=5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.983 186180 DEBUG nova.network.os_vif_util [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Converting VIF {"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.984 186180 DEBUG nova.network.os_vif_util [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:e9:f8,bridge_name='br-int',has_traffic_filtering=True,id=0b70b90a-4ed2-4fcb-af28-752acfd1efd5,network=Network(00d22388-5e2e-4a6a-9b67-767982d87ea6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b70b90a-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.984 186180 DEBUG os_vif [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:e9:f8,bridge_name='br-int',has_traffic_filtering=True,id=0b70b90a-4ed2-4fcb-af28-752acfd1efd5,network=Network(00d22388-5e2e-4a6a-9b67-767982d87ea6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b70b90a-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.984 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.985 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.985 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.987 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.988 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b70b90a-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.988 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b70b90a-4e, col_values=(('external_ids', {'iface-id': '0b70b90a-4ed2-4fcb-af28-752acfd1efd5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:e9:f8', 'vm-uuid': '5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:54:59 compute-0 NetworkManager[56463]: <info>  [1771264499.9907] manager: (tap0b70b90a-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.993 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:54:59 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.996 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:54:59.998 186180 INFO os_vif [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:e9:f8,bridge_name='br-int',has_traffic_filtering=True,id=0b70b90a-4ed2-4fcb-af28-752acfd1efd5,network=Network(00d22388-5e2e-4a6a-9b67-767982d87ea6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b70b90a-4e')
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.057 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.058 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.058 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] No VIF found with MAC fa:16:3e:38:e9:f8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.060 186180 INFO nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Using config drive
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.144 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.336 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.336 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.337 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.457 186180 INFO nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Creating config drive at /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk.config
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.464 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6cc94y6y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.594 186180 DEBUG oslo_concurrency.processutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp6cc94y6y" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:55:00 compute-0 kernel: tap0b70b90a-4e: entered promiscuous mode
Feb 16 17:55:00 compute-0 ovn_controller[96437]: 2026-02-16T17:55:00Z|00235|binding|INFO|Claiming lport 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 for this chassis.
Feb 16 17:55:00 compute-0 ovn_controller[96437]: 2026-02-16T17:55:00Z|00236|binding|INFO|0b70b90a-4ed2-4fcb-af28-752acfd1efd5: Claiming fa:16:3e:38:e9:f8 10.100.0.3
Feb 16 17:55:00 compute-0 NetworkManager[56463]: <info>  [1771264500.6778] manager: (tap0b70b90a-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.675 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.683 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.692 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:e9:f8 10.100.0.3'], port_security=['fa:16:3e:38:e9:f8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00d22388-5e2e-4a6a-9b67-767982d87ea6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd59fed88cd5a49aa81830c7075e99210', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0eed1575-bfa4-4c1e-b8f7-ee9038d09149', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d0900804-e19d-4f68-ba32-1c31a7899783, chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=0b70b90a-4ed2-4fcb-af28-752acfd1efd5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.694 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 in datapath 00d22388-5e2e-4a6a-9b67-767982d87ea6 bound to our chassis
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.696 105730 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 00d22388-5e2e-4a6a-9b67-767982d87ea6
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.709 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.710 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e9da358f-1e26-4a43-91bb-04ad0c703c2b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.711 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap00d22388-51 in ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 16 17:55:00 compute-0 systemd-udevd[216937]: Network interface NamePolicy= disabled on kernel command line.
Feb 16 17:55:00 compute-0 ovn_controller[96437]: 2026-02-16T17:55:00Z|00237|binding|INFO|Setting lport 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 ovn-installed in OVS
Feb 16 17:55:00 compute-0 ovn_controller[96437]: 2026-02-16T17:55:00Z|00238|binding|INFO|Setting lport 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 up in Southbound
Feb 16 17:55:00 compute-0 nova_compute[186176]: 2026-02-16 17:55:00.715 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.715 206858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap00d22388-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.715 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[68cfbcf4-7190-4cd3-8047-6cab36c044ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 systemd-machined[155631]: New machine qemu-22-instance-0000001d.
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.717 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[512fa15c-ff80-46c5-84ac-29826dc7d5ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 NetworkManager[56463]: <info>  [1771264500.7309] device (tap0b70b90a-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 16 17:55:00 compute-0 NetworkManager[56463]: <info>  [1771264500.7320] device (tap0b70b90a-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.732 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[5f585204-ac5f-47c1-bd81-2b7ebf15e1ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000001d.
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.749 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[b1d7eed0-3af4-4e33-8765-7834066f79c5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.784 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[b989bf75-8c05-4352-b3d3-311f6f950f51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 NetworkManager[56463]: <info>  [1771264500.7930] manager: (tap00d22388-50): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.792 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[77c531f1-8212-48fc-b85a-6b6b39d17e24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.833 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[94c7f153-a036-4bcf-bf80-572950ecd47e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.837 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[4dc6151a-d66f-47f9-aa57-a44b423ba515]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 NetworkManager[56463]: <info>  [1771264500.8662] device (tap00d22388-50): carrier: link connected
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.870 206872 DEBUG oslo.privsep.daemon [-] privsep: reply[656c5257-4e6e-41da-9375-cb1e5d4607c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.892 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[69317bce-3cc0-44a0-b0ad-c9cee301af77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00d22388-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:3a:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609639, 'reachable_time': 35509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216969, 'error': None, 'target': 'ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.909 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[2f02a3d6-fda4-4525-b2ff-bd84fc0d126e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3e:3a23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 609639, 'tstamp': 609639}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216970, 'error': None, 'target': 'ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.932 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[50f177e8-4572-449b-8884-68f3e5d08cef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00d22388-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3e:3a:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609639, 'reachable_time': 35509, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216971, 'error': None, 'target': 'ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:00 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:00.970 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[fecf0dfb-5723-44c2-9faa-db52813bc5db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:01.055 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[e61a7f3e-c390-46dd-90dc-bbd7249b866a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:01.058 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00d22388-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:01.059 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:01.060 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00d22388-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.108 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:01 compute-0 kernel: tap00d22388-50: entered promiscuous mode
Feb 16 17:55:01 compute-0 NetworkManager[56463]: <info>  [1771264501.1120] manager: (tap00d22388-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.112 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:01 compute-0 rsyslogd[1020]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:01.115 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap00d22388-50, col_values=(('external_ids', {'iface-id': 'fb954daa-5bfa-4d65-901d-7e41870bdde6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.117 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:01 compute-0 ovn_controller[96437]: 2026-02-16T17:55:01Z|00239|binding|INFO|Releasing lport fb954daa-5bfa-4d65-901d-7e41870bdde6 from this chassis (sb_readonly=0)
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.118 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:01.120 105730 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00d22388-5e2e-4a6a-9b67-767982d87ea6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00d22388-5e2e-4a6a-9b67-767982d87ea6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:01.121 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[4ec71972-f658-4bad-b2a9-3d4fc418d43c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:01.123 105730 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: global
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     log         /dev/log local0 debug
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     log-tag     haproxy-metadata-proxy-00d22388-5e2e-4a6a-9b67-767982d87ea6
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     user        root
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     group       root
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     maxconn     1024
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     pidfile     /var/lib/neutron/external/pids/00d22388-5e2e-4a6a-9b67-767982d87ea6.pid.haproxy
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     daemon
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: defaults
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     log global
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     mode http
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     option httplog
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     option dontlognull
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     option http-server-close
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     option forwardfor
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     retries                 3
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     timeout http-request    30s
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     timeout connect         30s
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     timeout client          32s
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     timeout server          32s
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     timeout http-keep-alive 30s
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: listen listener
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     bind 169.254.169.254:80
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     server metadata /var/lib/neutron/metadata_proxy
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:     http-request add-header X-OVN-Network-ID 00d22388-5e2e-4a6a-9b67-767982d87ea6
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 16 17:55:01 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:01.124 105730 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6', 'env', 'PROCESS_TAG=haproxy-00d22388-5e2e-4a6a-9b67-767982d87ea6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/00d22388-5e2e-4a6a-9b67-767982d87ea6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.125 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.133 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264501.1330905, 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.134 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] VM Started (Lifecycle Event)
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.155 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.162 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264501.1365657, 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.162 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] VM Paused (Lifecycle Event)
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.185 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.190 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.208 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.233 186180 DEBUG nova.compute.manager [req-3dda3efb-cb62-4f92-8374-ee1477246bf3 req-55e85e04-a357-4bd9-ad56-cd8499e2c401 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.233 186180 DEBUG oslo_concurrency.lockutils [req-3dda3efb-cb62-4f92-8374-ee1477246bf3 req-55e85e04-a357-4bd9-ad56-cd8499e2c401 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.234 186180 DEBUG oslo_concurrency.lockutils [req-3dda3efb-cb62-4f92-8374-ee1477246bf3 req-55e85e04-a357-4bd9-ad56-cd8499e2c401 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.234 186180 DEBUG oslo_concurrency.lockutils [req-3dda3efb-cb62-4f92-8374-ee1477246bf3 req-55e85e04-a357-4bd9-ad56-cd8499e2c401 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.235 186180 DEBUG nova.compute.manager [req-3dda3efb-cb62-4f92-8374-ee1477246bf3 req-55e85e04-a357-4bd9-ad56-cd8499e2c401 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Processing event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.235 186180 DEBUG nova.compute.manager [req-3dda3efb-cb62-4f92-8374-ee1477246bf3 req-55e85e04-a357-4bd9-ad56-cd8499e2c401 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.235 186180 DEBUG oslo_concurrency.lockutils [req-3dda3efb-cb62-4f92-8374-ee1477246bf3 req-55e85e04-a357-4bd9-ad56-cd8499e2c401 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.235 186180 DEBUG oslo_concurrency.lockutils [req-3dda3efb-cb62-4f92-8374-ee1477246bf3 req-55e85e04-a357-4bd9-ad56-cd8499e2c401 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.236 186180 DEBUG oslo_concurrency.lockutils [req-3dda3efb-cb62-4f92-8374-ee1477246bf3 req-55e85e04-a357-4bd9-ad56-cd8499e2c401 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.236 186180 DEBUG nova.compute.manager [req-3dda3efb-cb62-4f92-8374-ee1477246bf3 req-55e85e04-a357-4bd9-ad56-cd8499e2c401 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] No waiting events found dispatching network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.236 186180 WARNING nova.compute.manager [req-3dda3efb-cb62-4f92-8374-ee1477246bf3 req-55e85e04-a357-4bd9-ad56-cd8499e2c401 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received unexpected event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 for instance with vm_state building and task_state spawning.
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.238 186180 DEBUG nova.network.neutron [req-fcd40b29-4bd5-4652-8286-3c7fd54bfb87 req-0d3e290f-69c2-47ac-9bc6-5be9c5013526 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Updated VIF entry in instance network info cache for port 0b70b90a-4ed2-4fcb-af28-752acfd1efd5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.238 186180 DEBUG nova.network.neutron [req-fcd40b29-4bd5-4652-8286-3c7fd54bfb87 req-0d3e290f-69c2-47ac-9bc6-5be9c5013526 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Updating instance_info_cache with network_info: [{"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.240 186180 DEBUG nova.compute.manager [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.246 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264501.245831, 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.246 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] VM Resumed (Lifecycle Event)
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.249 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.254 186180 DEBUG oslo_concurrency.lockutils [req-fcd40b29-4bd5-4652-8286-3c7fd54bfb87 req-0d3e290f-69c2-47ac-9bc6-5be9c5013526 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.255 186180 INFO nova.virt.libvirt.driver [-] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Instance spawned successfully.
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.255 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.264 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.272 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.278 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.279 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.279 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.280 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.281 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.281 186180 DEBUG nova.virt.libvirt.driver [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.292 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.316 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.345 186180 INFO nova.compute.manager [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Took 4.19 seconds to spawn the instance on the hypervisor.
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.346 186180 DEBUG nova.compute.manager [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:55:01 compute-0 openstack_network_exporter[198360]: ERROR   17:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:55:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:55:01 compute-0 openstack_network_exporter[198360]: ERROR   17:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:55:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.422 186180 INFO nova.compute.manager [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Took 4.71 seconds to build instance.
Feb 16 17:55:01 compute-0 podman[217010]: 2026-02-16 17:55:01.51517274 +0000 UTC m=+0.053088414 container create cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 16 17:55:01 compute-0 systemd[1]: Started libpod-conmon-cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406.scope.
Feb 16 17:55:01 compute-0 podman[217010]: 2026-02-16 17:55:01.490025632 +0000 UTC m=+0.027941306 image pull 19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 16 17:55:01 compute-0 systemd[1]: Started libcrun container.
Feb 16 17:55:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62479eda639d9e6019091944fc47b63ec627a3414852e2fdc44685fac6dd4b56/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 16 17:55:01 compute-0 podman[217010]: 2026-02-16 17:55:01.621575171 +0000 UTC m=+0.159490895 container init cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 16 17:55:01 compute-0 nova_compute[186176]: 2026-02-16 17:55:01.624 186180 DEBUG oslo_concurrency.lockutils [None req-cedeec1f-b798-49e3-bae0-92a3f416f7a7 89c2f3c0467f4db98c90c15c7889ff05 d59fed88cd5a49aa81830c7075e99210 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:01 compute-0 podman[217010]: 2026-02-16 17:55:01.629139487 +0000 UTC m=+0.167055171 container start cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 17:55:01 compute-0 neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6[217025]: [NOTICE]   (217031) : New worker (217033) forked
Feb 16 17:55:01 compute-0 neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6[217025]: [NOTICE]   (217031) : Loading success.
Feb 16 17:55:02 compute-0 nova_compute[186176]: 2026-02-16 17:55:02.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.344 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.345 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.345 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.346 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.428 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.473 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.474 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.518 186180 DEBUG oslo_concurrency.processutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.688 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.690 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5615MB free_disk=73.22172927856445GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.690 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.691 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.757 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Instance 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.758 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.758 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.963 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing inventories for resource provider bb904aac-529f-46ef-9861-9c655a4b383c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.983 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating ProviderTree inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.984 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.991 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:04 compute-0 nova_compute[186176]: 2026-02-16 17:55:04.998 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing aggregate associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 17:55:05 compute-0 nova_compute[186176]: 2026-02-16 17:55:05.025 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing trait associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 17:55:05 compute-0 nova_compute[186176]: 2026-02-16 17:55:05.070 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:55:05 compute-0 nova_compute[186176]: 2026-02-16 17:55:05.088 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:55:05 compute-0 nova_compute[186176]: 2026-02-16 17:55:05.116 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:55:05 compute-0 nova_compute[186176]: 2026-02-16 17:55:05.116 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:05 compute-0 nova_compute[186176]: 2026-02-16 17:55:05.145 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:07 compute-0 nova_compute[186176]: 2026-02-16 17:55:07.113 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:55:07 compute-0 nova_compute[186176]: 2026-02-16 17:55:07.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:55:09 compute-0 nova_compute[186176]: 2026-02-16 17:55:09.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:55:09 compute-0 nova_compute[186176]: 2026-02-16 17:55:09.995 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:10 compute-0 nova_compute[186176]: 2026-02-16 17:55:10.147 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:11 compute-0 nova_compute[186176]: 2026-02-16 17:55:11.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:55:13 compute-0 ovn_controller[96437]: 2026-02-16T17:55:13Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:e9:f8 10.100.0.3
Feb 16 17:55:13 compute-0 ovn_controller[96437]: 2026-02-16T17:55:13Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:e9:f8 10.100.0.3
Feb 16 17:55:15 compute-0 nova_compute[186176]: 2026-02-16 17:55:14.999 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:15 compute-0 nova_compute[186176]: 2026-02-16 17:55:15.149 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:19 compute-0 podman[217066]: 2026-02-16 17:55:19.127839994 +0000 UTC m=+0.087657992 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64)
Feb 16 17:55:20 compute-0 nova_compute[186176]: 2026-02-16 17:55:20.007 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:20 compute-0 nova_compute[186176]: 2026-02-16 17:55:20.152 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:21 compute-0 podman[217089]: 2026-02-16 17:55:21.112740117 +0000 UTC m=+0.076534129 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 16 17:55:25 compute-0 nova_compute[186176]: 2026-02-16 17:55:25.010 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:25 compute-0 nova_compute[186176]: 2026-02-16 17:55:25.154 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:28 compute-0 podman[217111]: 2026-02-16 17:55:28.100718958 +0000 UTC m=+0.069418274 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:55:28 compute-0 podman[217110]: 2026-02-16 17:55:28.274386241 +0000 UTC m=+0.239781726 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 16 17:55:29 compute-0 podman[195505]: time="2026-02-16T17:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:55:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17245 "" "Go-http-client/1.1"
Feb 16 17:55:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2646 "" "Go-http-client/1.1"
Feb 16 17:55:30 compute-0 nova_compute[186176]: 2026-02-16 17:55:30.015 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:30 compute-0 nova_compute[186176]: 2026-02-16 17:55:30.157 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:31 compute-0 ovn_controller[96437]: 2026-02-16T17:55:31Z|00240|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Feb 16 17:55:31 compute-0 openstack_network_exporter[198360]: ERROR   17:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:55:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:55:31 compute-0 openstack_network_exporter[198360]: ERROR   17:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:55:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:55:31 compute-0 nova_compute[186176]: 2026-02-16 17:55:31.445 186180 DEBUG nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Check if temp file /var/lib/nova/instances/tmpytv4iqsi exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 16 17:55:31 compute-0 nova_compute[186176]: 2026-02-16 17:55:31.446 186180 DEBUG nova.compute.manager [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpytv4iqsi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 16 17:55:32 compute-0 nova_compute[186176]: 2026-02-16 17:55:32.915 186180 DEBUG oslo_concurrency.processutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:55:32 compute-0 nova_compute[186176]: 2026-02-16 17:55:32.997 186180 DEBUG oslo_concurrency.processutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:55:32 compute-0 nova_compute[186176]: 2026-02-16 17:55:32.999 186180 DEBUG oslo_concurrency.processutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 17:55:33 compute-0 nova_compute[186176]: 2026-02-16 17:55:33.057 186180 DEBUG oslo_concurrency.processutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 17:55:35 compute-0 nova_compute[186176]: 2026-02-16 17:55:35.018 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:35 compute-0 nova_compute[186176]: 2026-02-16 17:55:35.160 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:37 compute-0 sshd-session[217166]: Accepted publickey for nova from 192.168.122.101 port 39622 ssh2: ECDSA SHA256:9MH41QlXXBTBEUO+frglPDA4tL649dgNzsa+zO9IAZ4
Feb 16 17:55:37 compute-0 systemd-logind[821]: New session 45 of user nova.
Feb 16 17:55:37 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Feb 16 17:55:37 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 16 17:55:37 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 16 17:55:37 compute-0 systemd[1]: Starting User Manager for UID 42436...
Feb 16 17:55:37 compute-0 systemd[217170]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:55:37 compute-0 systemd[217170]: Queued start job for default target Main User Target.
Feb 16 17:55:37 compute-0 systemd[217170]: Created slice User Application Slice.
Feb 16 17:55:37 compute-0 systemd[217170]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:55:37 compute-0 systemd[217170]: Started Daily Cleanup of User's Temporary Directories.
Feb 16 17:55:37 compute-0 systemd[217170]: Reached target Paths.
Feb 16 17:55:37 compute-0 systemd[217170]: Reached target Timers.
Feb 16 17:55:37 compute-0 systemd[217170]: Starting D-Bus User Message Bus Socket...
Feb 16 17:55:37 compute-0 systemd[217170]: Starting Create User's Volatile Files and Directories...
Feb 16 17:55:37 compute-0 systemd[217170]: Listening on D-Bus User Message Bus Socket.
Feb 16 17:55:37 compute-0 systemd[217170]: Reached target Sockets.
Feb 16 17:55:37 compute-0 systemd[217170]: Finished Create User's Volatile Files and Directories.
Feb 16 17:55:37 compute-0 systemd[217170]: Reached target Basic System.
Feb 16 17:55:37 compute-0 systemd[217170]: Reached target Main User Target.
Feb 16 17:55:37 compute-0 systemd[217170]: Startup finished in 160ms.
Feb 16 17:55:37 compute-0 systemd[1]: Started User Manager for UID 42436.
Feb 16 17:55:37 compute-0 systemd[1]: Started Session 45 of User nova.
Feb 16 17:55:37 compute-0 sshd-session[217166]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Feb 16 17:55:37 compute-0 sshd-session[217185]: Received disconnect from 192.168.122.101 port 39622:11: disconnected by user
Feb 16 17:55:37 compute-0 sshd-session[217185]: Disconnected from user nova 192.168.122.101 port 39622
Feb 16 17:55:37 compute-0 sshd-session[217166]: pam_unix(sshd:session): session closed for user nova
Feb 16 17:55:37 compute-0 systemd[1]: session-45.scope: Deactivated successfully.
Feb 16 17:55:37 compute-0 systemd-logind[821]: Session 45 logged out. Waiting for processes to exit.
Feb 16 17:55:37 compute-0 systemd-logind[821]: Removed session 45.
Feb 16 17:55:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:38.190 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:38.192 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:38.193 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:39 compute-0 nova_compute[186176]: 2026-02-16 17:55:39.963 186180 DEBUG nova.compute.manager [req-e626e4e0-7232-4810-bda7-79a5c0256ba7 req-b803b6c2-b839-4807-92d2-d46b4d516439 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-unplugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:55:39 compute-0 nova_compute[186176]: 2026-02-16 17:55:39.964 186180 DEBUG oslo_concurrency.lockutils [req-e626e4e0-7232-4810-bda7-79a5c0256ba7 req-b803b6c2-b839-4807-92d2-d46b4d516439 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:39 compute-0 nova_compute[186176]: 2026-02-16 17:55:39.965 186180 DEBUG oslo_concurrency.lockutils [req-e626e4e0-7232-4810-bda7-79a5c0256ba7 req-b803b6c2-b839-4807-92d2-d46b4d516439 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:39 compute-0 nova_compute[186176]: 2026-02-16 17:55:39.965 186180 DEBUG oslo_concurrency.lockutils [req-e626e4e0-7232-4810-bda7-79a5c0256ba7 req-b803b6c2-b839-4807-92d2-d46b4d516439 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:39 compute-0 nova_compute[186176]: 2026-02-16 17:55:39.965 186180 DEBUG nova.compute.manager [req-e626e4e0-7232-4810-bda7-79a5c0256ba7 req-b803b6c2-b839-4807-92d2-d46b4d516439 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] No waiting events found dispatching network-vif-unplugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:55:39 compute-0 nova_compute[186176]: 2026-02-16 17:55:39.966 186180 DEBUG nova.compute.manager [req-e626e4e0-7232-4810-bda7-79a5c0256ba7 req-b803b6c2-b839-4807-92d2-d46b4d516439 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-unplugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:55:39 compute-0 nova_compute[186176]: 2026-02-16 17:55:39.976 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:39.976 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:55:39 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:39.978 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 17:55:40 compute-0 nova_compute[186176]: 2026-02-16 17:55:40.020 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:40 compute-0 nova_compute[186176]: 2026-02-16 17:55:40.162 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:40 compute-0 nova_compute[186176]: 2026-02-16 17:55:40.978 186180 INFO nova.compute.manager [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Took 7.92 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Feb 16 17:55:40 compute-0 nova_compute[186176]: 2026-02-16 17:55:40.980 186180 DEBUG nova.compute.manager [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.000 186180 DEBUG nova.compute.manager [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpytv4iqsi',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(9572d8b7-e9ed-42a2-bd53-4cdb91ae8c96),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.026 186180 DEBUG nova.objects.instance [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lazy-loading 'migration_context' on Instance uuid 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.027 186180 DEBUG nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.029 186180 DEBUG nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.029 186180 DEBUG nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.053 186180 DEBUG nova.virt.libvirt.vif [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:54:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-2068269080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-2068269080',id=29,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:55:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d59fed88cd5a49aa81830c7075e99210',ramdisk_id='',reservation_id='r-zjmfmfzl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1931654466',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1931654466-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:55:01Z,user_data=None,user_id='89c2f3c0467f4db98c90c15c7889ff05',uuid=5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.053 186180 DEBUG nova.network.os_vif_util [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.054 186180 DEBUG nova.network.os_vif_util [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:e9:f8,bridge_name='br-int',has_traffic_filtering=True,id=0b70b90a-4ed2-4fcb-af28-752acfd1efd5,network=Network(00d22388-5e2e-4a6a-9b67-767982d87ea6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b70b90a-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.055 186180 DEBUG nova.virt.libvirt.migration [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Updating guest XML with vif config: <interface type="ethernet">
Feb 16 17:55:41 compute-0 nova_compute[186176]:   <mac address="fa:16:3e:38:e9:f8"/>
Feb 16 17:55:41 compute-0 nova_compute[186176]:   <model type="virtio"/>
Feb 16 17:55:41 compute-0 nova_compute[186176]:   <driver name="vhost" rx_queue_size="512"/>
Feb 16 17:55:41 compute-0 nova_compute[186176]:   <mtu size="1442"/>
Feb 16 17:55:41 compute-0 nova_compute[186176]:   <target dev="tap0b70b90a-4e"/>
Feb 16 17:55:41 compute-0 nova_compute[186176]: </interface>
Feb 16 17:55:41 compute-0 nova_compute[186176]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.056 186180 DEBUG nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.532 186180 DEBUG nova.virt.libvirt.migration [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 16 17:55:41 compute-0 nova_compute[186176]: 2026-02-16 17:55:41.533 186180 INFO nova.virt.libvirt.migration [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.445 186180 DEBUG nova.virt.driver [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] Emitting event <LifecycleEvent: 1771264543.4453588, 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.446 186180 INFO nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] VM Paused (Lifecycle Event)
Feb 16 17:55:43 compute-0 kernel: tap0b70b90a-4e (unregistering): left promiscuous mode
Feb 16 17:55:43 compute-0 NetworkManager[56463]: <info>  [1771264543.5980] device (tap0b70b90a-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 16 17:55:43 compute-0 ovn_controller[96437]: 2026-02-16T17:55:43Z|00241|binding|INFO|Releasing lport 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 from this chassis (sb_readonly=0)
Feb 16 17:55:43 compute-0 ovn_controller[96437]: 2026-02-16T17:55:43Z|00242|binding|INFO|Setting lport 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 down in Southbound
Feb 16 17:55:43 compute-0 ovn_controller[96437]: 2026-02-16T17:55:43Z|00243|binding|INFO|Removing iface tap0b70b90a-4e ovn-installed in OVS
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.608 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.623 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:43 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Feb 16 17:55:43 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001d.scope: Consumed 13.130s CPU time.
Feb 16 17:55:43 compute-0 systemd-machined[155631]: Machine qemu-22-instance-0000001d terminated.
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.846 186180 DEBUG nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.846 186180 DEBUG nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.847 186180 DEBUG nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.906 186180 DEBUG nova.compute.manager [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.907 186180 DEBUG oslo_concurrency.lockutils [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.907 186180 DEBUG oslo_concurrency.lockutils [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.907 186180 DEBUG oslo_concurrency.lockutils [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.908 186180 DEBUG nova.compute.manager [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] No waiting events found dispatching network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.908 186180 WARNING nova.compute.manager [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received unexpected event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 for instance with vm_state active and task_state migrating.
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.909 186180 DEBUG nova.compute.manager [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-changed-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.909 186180 DEBUG nova.compute.manager [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Refreshing instance network info cache due to event network-changed-0b70b90a-4ed2-4fcb-af28-752acfd1efd5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.909 186180 DEBUG oslo_concurrency.lockutils [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "refresh_cache-5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.910 186180 DEBUG oslo_concurrency.lockutils [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquired lock "refresh_cache-5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.910 186180 DEBUG nova.network.neutron [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Refreshing network info cache for port 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 16 17:55:43 compute-0 ovn_controller[96437]: 2026-02-16T17:55:43Z|00244|binding|INFO|Releasing lport fb954daa-5bfa-4d65-901d-7e41870bdde6 from this chassis (sb_readonly=0)
Feb 16 17:55:43 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:43.913 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:e9:f8 10.100.0.3'], port_security=['fa:16:3e:38:e9:f8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '2e3a84a9-c1b4-4b1e-92e3-57d0875592cc'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00d22388-5e2e-4a6a-9b67-767982d87ea6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd59fed88cd5a49aa81830c7075e99210', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0eed1575-bfa4-4c1e-b8f7-ee9038d09149', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d0900804-e19d-4f68-ba32-1c31a7899783, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>], logical_port=0b70b90a-4ed2-4fcb-af28-752acfd1efd5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f415cd638e0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 17:55:43 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:43.915 105730 INFO neutron.agent.ovn.metadata.agent [-] Port 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 in datapath 00d22388-5e2e-4a6a-9b67-767982d87ea6 unbound from our chassis
Feb 16 17:55:43 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:43.916 105730 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00d22388-5e2e-4a6a-9b67-767982d87ea6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 16 17:55:43 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:43.918 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[3a66c47a-32f9-4f6c-bc68-6d3ec5c19655]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:43 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:43.918 105730 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6 namespace which is not needed anymore
Feb 16 17:55:43 compute-0 nova_compute[186176]: 2026-02-16 17:55:43.938 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.050 186180 DEBUG nova.compute.manager [None req-2c32de00-a201-41f9-89c0-f3de28c77990 - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:55:44 compute-0 neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6[217025]: [NOTICE]   (217031) : haproxy version is 2.8.14-c23fe91
Feb 16 17:55:44 compute-0 neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6[217025]: [NOTICE]   (217031) : path to executable is /usr/sbin/haproxy
Feb 16 17:55:44 compute-0 neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6[217025]: [ALERT]    (217031) : Current worker (217033) exited with code 143 (Terminated)
Feb 16 17:55:44 compute-0 neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6[217025]: [WARNING]  (217031) : All workers exited. Exiting... (0)
Feb 16 17:55:44 compute-0 systemd[1]: libpod-cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406.scope: Deactivated successfully.
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.086 186180 INFO nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 16 17:55:44 compute-0 podman[217245]: 2026-02-16 17:55:44.095561697 +0000 UTC m=+0.061182772 container died cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 16 17:55:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406-userdata-shm.mount: Deactivated successfully.
Feb 16 17:55:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-62479eda639d9e6019091944fc47b63ec627a3414852e2fdc44685fac6dd4b56-merged.mount: Deactivated successfully.
Feb 16 17:55:44 compute-0 podman[217245]: 2026-02-16 17:55:44.134242087 +0000 UTC m=+0.099863142 container cleanup cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 17:55:44 compute-0 systemd[1]: libpod-conmon-cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406.scope: Deactivated successfully.
Feb 16 17:55:44 compute-0 podman[217277]: 2026-02-16 17:55:44.209606076 +0000 UTC m=+0.049091315 container remove cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 17:55:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:44.214 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[d30acfb0-3db5-49af-9656-97ca3cea66e4]: (4, ('Mon Feb 16 05:55:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6 (cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406)\ncfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406\nMon Feb 16 05:55:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6 (cfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406)\ncfa833e3be96595f35d39628971e3828d86b288223011b071bdd2bb4c4f11406\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:44.215 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[afdaeb8f-7c90-4d86-ba05-94386b9c8780]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:44.216 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00d22388-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.218 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:44 compute-0 kernel: tap00d22388-50: left promiscuous mode
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.228 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:44.233 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[1b8d5af2-dce3-4648-9af0-470264f0c034]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:44.247 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e38a0a-4c4c-4e8f-afc6-6758d959888d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:44.249 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf5eaa9-7276-47dd-bb43-a5b3481dfde1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:44.263 206858 DEBUG oslo.privsep.daemon [-] privsep: reply[06e90839-2157-4781-948a-8a855eac91b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 609630, 'reachable_time': 23384, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217296, 'error': None, 'target': 'ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:44 compute-0 systemd[1]: run-netns-ovnmeta\x2d00d22388\x2d5e2e\x2d4a6a\x2d9b67\x2d767982d87ea6.mount: Deactivated successfully.
Feb 16 17:55:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:44.268 106250 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-00d22388-5e2e-4a6a-9b67-767982d87ea6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 16 17:55:44 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:44.269 106250 DEBUG oslo.privsep.daemon [-] privsep: reply[897ec165-dc7e-4574-9d48-26133cd048a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.589 186180 DEBUG nova.virt.libvirt.guest [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34' (instance-0000001d) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.590 186180 INFO nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Migration operation has completed
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.590 186180 INFO nova.compute.manager [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] _post_live_migration() is started..
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.640 186180 DEBUG nova.compute.manager [req-eee29843-ee23-4d1d-9047-2c1a74c207a1 req-409e7f42-ab5c-4829-a6c2-b10c4f175e3d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-unplugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.641 186180 DEBUG oslo_concurrency.lockutils [req-eee29843-ee23-4d1d-9047-2c1a74c207a1 req-409e7f42-ab5c-4829-a6c2-b10c4f175e3d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.641 186180 DEBUG oslo_concurrency.lockutils [req-eee29843-ee23-4d1d-9047-2c1a74c207a1 req-409e7f42-ab5c-4829-a6c2-b10c4f175e3d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.642 186180 DEBUG oslo_concurrency.lockutils [req-eee29843-ee23-4d1d-9047-2c1a74c207a1 req-409e7f42-ab5c-4829-a6c2-b10c4f175e3d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.642 186180 DEBUG nova.compute.manager [req-eee29843-ee23-4d1d-9047-2c1a74c207a1 req-409e7f42-ab5c-4829-a6c2-b10c4f175e3d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] No waiting events found dispatching network-vif-unplugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:55:44 compute-0 nova_compute[186176]: 2026-02-16 17:55:44.643 186180 DEBUG nova.compute.manager [req-eee29843-ee23-4d1d-9047-2c1a74c207a1 req-409e7f42-ab5c-4829-a6c2-b10c4f175e3d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-unplugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.022 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.165 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.747 186180 DEBUG nova.network.neutron [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Activated binding for port 0b70b90a-4ed2-4fcb-af28-752acfd1efd5 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.749 186180 DEBUG nova.compute.manager [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.750 186180 DEBUG nova.virt.libvirt.vif [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-16T17:54:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-2068269080',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-2068269080',id=29,image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-16T17:55:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d59fed88cd5a49aa81830c7075e99210',ramdisk_id='',reservation_id='r-zjmfmfzl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='7a81518d-a287-4a96-937c-188ae866c5b8',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-1931654466',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-1931654466-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-16T17:55:28Z,user_data=None,user_id='89c2f3c0467f4db98c90c15c7889ff05',uuid=5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.751 186180 DEBUG nova.network.os_vif_util [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converting VIF {"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.753 186180 DEBUG nova.network.os_vif_util [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:e9:f8,bridge_name='br-int',has_traffic_filtering=True,id=0b70b90a-4ed2-4fcb-af28-752acfd1efd5,network=Network(00d22388-5e2e-4a6a-9b67-767982d87ea6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b70b90a-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.754 186180 DEBUG os_vif [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:e9:f8,bridge_name='br-int',has_traffic_filtering=True,id=0b70b90a-4ed2-4fcb-af28-752acfd1efd5,network=Network(00d22388-5e2e-4a6a-9b67-767982d87ea6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b70b90a-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.756 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.757 186180 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b70b90a-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.788 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.792 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.796 186180 INFO os_vif [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:e9:f8,bridge_name='br-int',has_traffic_filtering=True,id=0b70b90a-4ed2-4fcb-af28-752acfd1efd5,network=Network(00d22388-5e2e-4a6a-9b67-767982d87ea6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b70b90a-4e')
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.797 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.797 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.798 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.798 186180 DEBUG nova.compute.manager [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.799 186180 INFO nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Deleting instance files /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34_del
Feb 16 17:55:45 compute-0 nova_compute[186176]: 2026-02-16 17:55:45.800 186180 INFO nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Deletion of /var/lib/nova/instances/5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34_del complete
Feb 16 17:55:45 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:55:45.981 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.240 186180 DEBUG nova.compute.manager [req-c3ada49f-01ad-4aa6-bcb6-cffeb9f7e324 req-3d13120f-ee9e-49c5-a677-1e9dd9d78c1b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-unplugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.240 186180 DEBUG oslo_concurrency.lockutils [req-c3ada49f-01ad-4aa6-bcb6-cffeb9f7e324 req-3d13120f-ee9e-49c5-a677-1e9dd9d78c1b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.241 186180 DEBUG oslo_concurrency.lockutils [req-c3ada49f-01ad-4aa6-bcb6-cffeb9f7e324 req-3d13120f-ee9e-49c5-a677-1e9dd9d78c1b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.241 186180 DEBUG oslo_concurrency.lockutils [req-c3ada49f-01ad-4aa6-bcb6-cffeb9f7e324 req-3d13120f-ee9e-49c5-a677-1e9dd9d78c1b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.241 186180 DEBUG nova.compute.manager [req-c3ada49f-01ad-4aa6-bcb6-cffeb9f7e324 req-3d13120f-ee9e-49c5-a677-1e9dd9d78c1b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] No waiting events found dispatching network-vif-unplugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.242 186180 DEBUG nova.compute.manager [req-c3ada49f-01ad-4aa6-bcb6-cffeb9f7e324 req-3d13120f-ee9e-49c5-a677-1e9dd9d78c1b 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-unplugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.420 186180 DEBUG nova.network.neutron [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Updated VIF entry in instance network info cache for port 0b70b90a-4ed2-4fcb-af28-752acfd1efd5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.421 186180 DEBUG nova.network.neutron [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Updating instance_info_cache with network_info: [{"id": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "address": "fa:16:3e:38:e9:f8", "network": {"id": "00d22388-5e2e-4a6a-9b67-767982d87ea6", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-2072550866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59fed88cd5a49aa81830c7075e99210", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b70b90a-4e", "ovs_interfaceid": "0b70b90a-4ed2-4fcb-af28-752acfd1efd5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.455 186180 DEBUG oslo_concurrency.lockutils [req-d1245bf2-938d-4155-a785-4cc7093d869b req-819c8de2-1725-47e5-b77b-7db43e61ece3 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Releasing lock "refresh_cache-5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.738 186180 DEBUG nova.compute.manager [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.739 186180 DEBUG oslo_concurrency.lockutils [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.740 186180 DEBUG oslo_concurrency.lockutils [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.740 186180 DEBUG oslo_concurrency.lockutils [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.740 186180 DEBUG nova.compute.manager [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] No waiting events found dispatching network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.741 186180 WARNING nova.compute.manager [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received unexpected event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 for instance with vm_state active and task_state migrating.
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.741 186180 DEBUG nova.compute.manager [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.742 186180 DEBUG oslo_concurrency.lockutils [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.742 186180 DEBUG oslo_concurrency.lockutils [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.743 186180 DEBUG oslo_concurrency.lockutils [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.743 186180 DEBUG nova.compute.manager [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] No waiting events found dispatching network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.744 186180 WARNING nova.compute.manager [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received unexpected event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 for instance with vm_state active and task_state migrating.
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.744 186180 DEBUG nova.compute.manager [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.745 186180 DEBUG oslo_concurrency.lockutils [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.745 186180 DEBUG oslo_concurrency.lockutils [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.745 186180 DEBUG oslo_concurrency.lockutils [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.746 186180 DEBUG nova.compute.manager [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] No waiting events found dispatching network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:55:46 compute-0 nova_compute[186176]: 2026-02-16 17:55:46.746 186180 WARNING nova.compute.manager [req-3733a151-ad48-4f10-b022-7dff7d9c30b7 req-40bb435f-d4a4-414c-8283-d6220b8e050d 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received unexpected event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 for instance with vm_state active and task_state migrating.
Feb 16 17:55:47 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Feb 16 17:55:47 compute-0 systemd[217170]: Activating special unit Exit the Session...
Feb 16 17:55:47 compute-0 systemd[217170]: Stopped target Main User Target.
Feb 16 17:55:47 compute-0 systemd[217170]: Stopped target Basic System.
Feb 16 17:55:47 compute-0 systemd[217170]: Stopped target Paths.
Feb 16 17:55:47 compute-0 systemd[217170]: Stopped target Sockets.
Feb 16 17:55:47 compute-0 systemd[217170]: Stopped target Timers.
Feb 16 17:55:47 compute-0 systemd[217170]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 16 17:55:47 compute-0 systemd[217170]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 16 17:55:47 compute-0 systemd[217170]: Closed D-Bus User Message Bus Socket.
Feb 16 17:55:47 compute-0 systemd[217170]: Stopped Create User's Volatile Files and Directories.
Feb 16 17:55:47 compute-0 systemd[217170]: Removed slice User Application Slice.
Feb 16 17:55:47 compute-0 systemd[217170]: Reached target Shutdown.
Feb 16 17:55:47 compute-0 systemd[217170]: Finished Exit the Session.
Feb 16 17:55:47 compute-0 systemd[217170]: Reached target Exit the Session.
Feb 16 17:55:47 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Feb 16 17:55:47 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Feb 16 17:55:47 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 16 17:55:47 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 16 17:55:47 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 16 17:55:47 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 16 17:55:47 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Feb 16 17:55:48 compute-0 nova_compute[186176]: 2026-02-16 17:55:48.923 186180 DEBUG nova.compute.manager [req-fb58ea47-8538-46c6-abe1-1a7c18e7e856 req-955a366b-1a70-4536-bd9d-471fddf2b51e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 16 17:55:48 compute-0 nova_compute[186176]: 2026-02-16 17:55:48.924 186180 DEBUG oslo_concurrency.lockutils [req-fb58ea47-8538-46c6-abe1-1a7c18e7e856 req-955a366b-1a70-4536-bd9d-471fddf2b51e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:48 compute-0 nova_compute[186176]: 2026-02-16 17:55:48.924 186180 DEBUG oslo_concurrency.lockutils [req-fb58ea47-8538-46c6-abe1-1a7c18e7e856 req-955a366b-1a70-4536-bd9d-471fddf2b51e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:48 compute-0 nova_compute[186176]: 2026-02-16 17:55:48.924 186180 DEBUG oslo_concurrency.lockutils [req-fb58ea47-8538-46c6-abe1-1a7c18e7e856 req-955a366b-1a70-4536-bd9d-471fddf2b51e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:48 compute-0 nova_compute[186176]: 2026-02-16 17:55:48.925 186180 DEBUG nova.compute.manager [req-fb58ea47-8538-46c6-abe1-1a7c18e7e856 req-955a366b-1a70-4536-bd9d-471fddf2b51e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] No waiting events found dispatching network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 16 17:55:48 compute-0 nova_compute[186176]: 2026-02-16 17:55:48.925 186180 WARNING nova.compute.manager [req-fb58ea47-8538-46c6-abe1-1a7c18e7e856 req-955a366b-1a70-4536-bd9d-471fddf2b51e 7adcc06f5bc04cd5a1aba80facf6e45c e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Received unexpected event network-vif-plugged-0b70b90a-4ed2-4fcb-af28-752acfd1efd5 for instance with vm_state active and task_state migrating.
Feb 16 17:55:50 compute-0 podman[217298]: 2026-02-16 17:55:50.114876444 +0000 UTC m=+0.084528265 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, name=ubi9/ubi-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 16 17:55:50 compute-0 nova_compute[186176]: 2026-02-16 17:55:50.182 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:50 compute-0 nova_compute[186176]: 2026-02-16 17:55:50.789 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:52 compute-0 podman[217319]: 2026-02-16 17:55:52.105209172 +0000 UTC m=+0.066352829 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.258 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.259 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.259 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.300 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.300 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.301 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.301 186180 DEBUG nova.compute.resource_tracker [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.479 186180 WARNING nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.481 186180 DEBUG nova.compute.resource_tracker [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5793MB free_disk=73.22265625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", 
"product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.481 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.481 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.577 186180 DEBUG nova.compute.resource_tracker [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Migration for instance 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.616 186180 DEBUG nova.compute.resource_tracker [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.678 186180 DEBUG nova.compute.resource_tracker [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Migration 9572d8b7-e9ed-42a2-bd53-4cdb91ae8c96 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.679 186180 DEBUG nova.compute.resource_tracker [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.679 186180 DEBUG nova.compute.resource_tracker [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.727 186180 DEBUG nova.compute.provider_tree [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.749 186180 DEBUG nova.scheduler.client.report [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.776 186180 DEBUG nova.compute.resource_tracker [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.777 186180 DEBUG oslo_concurrency.lockutils [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.781 186180 INFO nova.compute.manager [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.916 186180 INFO nova.scheduler.client.report [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] Deleted allocation for migration 9572d8b7-e9ed-42a2-bd53-4cdb91ae8c96
Feb 16 17:55:52 compute-0 nova_compute[186176]: 2026-02-16 17:55:52.917 186180 DEBUG nova.virt.libvirt.driver [None req-0bac6c89-ca0b-4e7b-beeb-28c3af78dc66 b0deaa82c38641babea4d56d5c6c4f63 e0af68bb83ee44e5b9e317be71e76833 - - default default] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 16 17:55:55 compute-0 nova_compute[186176]: 2026-02-16 17:55:55.792 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:55:55 compute-0 nova_compute[186176]: 2026-02-16 17:55:55.794 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:55:55 compute-0 nova_compute[186176]: 2026-02-16 17:55:55.794 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:55:55 compute-0 nova_compute[186176]: 2026-02-16 17:55:55.794 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:55:56 compute-0 nova_compute[186176]: 2026-02-16 17:55:56.217 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:55:56 compute-0 nova_compute[186176]: 2026-02-16 17:55:56.218 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:55:59 compute-0 nova_compute[186176]: 2026-02-16 17:55:59.055 186180 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771264543.8420403, 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 16 17:55:59 compute-0 nova_compute[186176]: 2026-02-16 17:55:59.056 186180 INFO nova.compute.manager [-] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] VM Stopped (Lifecycle Event)
Feb 16 17:55:59 compute-0 nova_compute[186176]: 2026-02-16 17:55:59.092 186180 DEBUG nova.compute.manager [None req-597a0bf6-a897-4c40-b628-ea043cbf453d - - - - - -] [instance: 5ea6e29c-9e06-4d89-8a8b-aa7c1fbcae34] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 16 17:55:59 compute-0 podman[217340]: 2026-02-16 17:55:59.110982099 +0000 UTC m=+0.073873754 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 17:55:59 compute-0 podman[217339]: 2026-02-16 17:55:59.143823535 +0000 UTC m=+0.108940775 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 16 17:55:59 compute-0 podman[195505]: time="2026-02-16T17:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:55:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:55:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 17:56:00 compute-0 nova_compute[186176]: 2026-02-16 17:56:00.319 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:56:00 compute-0 nova_compute[186176]: 2026-02-16 17:56:00.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:56:00 compute-0 nova_compute[186176]: 2026-02-16 17:56:00.320 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:56:00 compute-0 nova_compute[186176]: 2026-02-16 17:56:00.468 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:56:01 compute-0 nova_compute[186176]: 2026-02-16 17:56:01.218 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:01 compute-0 nova_compute[186176]: 2026-02-16 17:56:01.221 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:01 compute-0 nova_compute[186176]: 2026-02-16 17:56:01.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:56:01 compute-0 openstack_network_exporter[198360]: ERROR   17:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:56:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:56:01 compute-0 openstack_network_exporter[198360]: ERROR   17:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:56:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:56:02 compute-0 nova_compute[186176]: 2026-02-16 17:56:02.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:56:02 compute-0 nova_compute[186176]: 2026-02-16 17:56:02.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:56:03 compute-0 nova_compute[186176]: 2026-02-16 17:56:03.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.356 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.357 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.357 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.358 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.571 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.572 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5805MB free_disk=73.22267532348633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.572 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.573 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.918 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.919 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:56:05 compute-0 nova_compute[186176]: 2026-02-16 17:56:05.984 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:56:06 compute-0 nova_compute[186176]: 2026-02-16 17:56:06.019 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:56:06 compute-0 nova_compute[186176]: 2026-02-16 17:56:06.021 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:56:06 compute-0 nova_compute[186176]: 2026-02-16 17:56:06.021 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.449s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:56:06 compute-0 nova_compute[186176]: 2026-02-16 17:56:06.221 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:06 compute-0 nova_compute[186176]: 2026-02-16 17:56:06.222 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:06 compute-0 nova_compute[186176]: 2026-02-16 17:56:06.223 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:56:06 compute-0 nova_compute[186176]: 2026-02-16 17:56:06.223 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:06 compute-0 nova_compute[186176]: 2026-02-16 17:56:06.252 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:06 compute-0 nova_compute[186176]: 2026-02-16 17:56:06.253 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:07 compute-0 nova_compute[186176]: 2026-02-16 17:56:07.017 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:56:08 compute-0 nova_compute[186176]: 2026-02-16 17:56:08.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:56:09 compute-0 nova_compute[186176]: 2026-02-16 17:56:09.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:56:11 compute-0 nova_compute[186176]: 2026-02-16 17:56:11.295 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:11 compute-0 nova_compute[186176]: 2026-02-16 17:56:11.296 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:11 compute-0 nova_compute[186176]: 2026-02-16 17:56:11.296 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5042 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:56:11 compute-0 nova_compute[186176]: 2026-02-16 17:56:11.296 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:11 compute-0 nova_compute[186176]: 2026-02-16 17:56:11.297 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:11 compute-0 nova_compute[186176]: 2026-02-16 17:56:11.297 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:11 compute-0 nova_compute[186176]: 2026-02-16 17:56:11.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:56:16 compute-0 nova_compute[186176]: 2026-02-16 17:56:16.299 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:21 compute-0 podman[217392]: 2026-02-16 17:56:21.118904324 +0000 UTC m=+0.078241051 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, io.openshift.expose-services=, version=9.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Feb 16 17:56:21 compute-0 nova_compute[186176]: 2026-02-16 17:56:21.303 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:21 compute-0 nova_compute[186176]: 2026-02-16 17:56:21.305 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:21 compute-0 nova_compute[186176]: 2026-02-16 17:56:21.305 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:56:21 compute-0 nova_compute[186176]: 2026-02-16 17:56:21.306 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:21 compute-0 nova_compute[186176]: 2026-02-16 17:56:21.339 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:21 compute-0 nova_compute[186176]: 2026-02-16 17:56:21.339 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:23 compute-0 podman[217413]: 2026-02-16 17:56:23.10838323 +0000 UTC m=+0.077423681 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 16 17:56:24 compute-0 sshd-session[217433]: Connection closed by 64.89.160.135 port 58000
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.317 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.318 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.319 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.319 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.319 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.320 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.340 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.342 186180 DEBUG nova.virt.libvirt.imagecache [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.342 186180 WARNING nova.virt.libvirt.imagecache [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Unknown base file: /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.342 186180 INFO nova.virt.libvirt.imagecache [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Removable base files: /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.343 186180 INFO nova.virt.libvirt.imagecache [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/34459df773b91356960ca90fb27335ee0115c646
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.343 186180 DEBUG nova.virt.libvirt.imagecache [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.343 186180 DEBUG nova.virt.libvirt.imagecache [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.344 186180 DEBUG nova.virt.libvirt.imagecache [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.344 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.344 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.345 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.390 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:26 compute-0 nova_compute[186176]: 2026-02-16 17:56:26.390 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:29 compute-0 podman[195505]: time="2026-02-16T17:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:56:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:56:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 17:56:30 compute-0 podman[217435]: 2026-02-16 17:56:30.090876075 +0000 UTC m=+0.059898221 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:56:30 compute-0 podman[217434]: 2026-02-16 17:56:30.103948516 +0000 UTC m=+0.076631342 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Feb 16 17:56:31 compute-0 nova_compute[186176]: 2026-02-16 17:56:31.392 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:31 compute-0 nova_compute[186176]: 2026-02-16 17:56:31.394 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:31 compute-0 nova_compute[186176]: 2026-02-16 17:56:31.394 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:56:31 compute-0 nova_compute[186176]: 2026-02-16 17:56:31.394 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:31 compute-0 nova_compute[186176]: 2026-02-16 17:56:31.427 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:31 compute-0 openstack_network_exporter[198360]: ERROR   17:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:56:31 compute-0 nova_compute[186176]: 2026-02-16 17:56:31.428 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:56:31 compute-0 openstack_network_exporter[198360]: ERROR   17:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:56:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:56:31 compute-0 nova_compute[186176]: 2026-02-16 17:56:31.428 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:36 compute-0 nova_compute[186176]: 2026-02-16 17:56:36.429 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:36 compute-0 nova_compute[186176]: 2026-02-16 17:56:36.431 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:36 compute-0 nova_compute[186176]: 2026-02-16 17:56:36.431 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:56:36 compute-0 nova_compute[186176]: 2026-02-16 17:56:36.432 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:36 compute-0 nova_compute[186176]: 2026-02-16 17:56:36.432 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:36 compute-0 nova_compute[186176]: 2026-02-16 17:56:36.433 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:56:38.193 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:56:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:56:38.193 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:56:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:56:38.193 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:56:41 compute-0 nova_compute[186176]: 2026-02-16 17:56:41.434 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:44 compute-0 sshd-session[217484]: Invalid user sol from 2.57.122.210 port 35630
Feb 16 17:56:44 compute-0 sshd-session[217484]: Connection closed by invalid user sol 2.57.122.210 port 35630 [preauth]
Feb 16 17:56:46 compute-0 nova_compute[186176]: 2026-02-16 17:56:46.437 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:46 compute-0 nova_compute[186176]: 2026-02-16 17:56:46.439 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:46 compute-0 nova_compute[186176]: 2026-02-16 17:56:46.439 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:56:46 compute-0 nova_compute[186176]: 2026-02-16 17:56:46.439 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:46 compute-0 nova_compute[186176]: 2026-02-16 17:56:46.470 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:46 compute-0 nova_compute[186176]: 2026-02-16 17:56:46.471 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:51 compute-0 nova_compute[186176]: 2026-02-16 17:56:51.472 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:52 compute-0 podman[217486]: 2026-02-16 17:56:52.108210539 +0000 UTC m=+0.078176820 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9/ubi-minimal, distribution-scope=public, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 16 17:56:53 compute-0 ovn_controller[96437]: 2026-02-16T17:56:53Z|00245|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Feb 16 17:56:54 compute-0 podman[217507]: 2026-02-16 17:56:54.100888503 +0000 UTC m=+0.070382628 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 16 17:56:56 compute-0 nova_compute[186176]: 2026-02-16 17:56:56.478 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:56 compute-0 nova_compute[186176]: 2026-02-16 17:56:56.480 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:56:56 compute-0 nova_compute[186176]: 2026-02-16 17:56:56.480 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:56:56 compute-0 nova_compute[186176]: 2026-02-16 17:56:56.480 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:56 compute-0 nova_compute[186176]: 2026-02-16 17:56:56.519 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:56:56 compute-0 nova_compute[186176]: 2026-02-16 17:56:56.520 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:56:58 compute-0 nova_compute[186176]: 2026-02-16 17:56:58.205 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:56:59 compute-0 podman[195505]: time="2026-02-16T17:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:56:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:56:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 17:57:00 compute-0 nova_compute[186176]: 2026-02-16 17:57:00.334 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:00 compute-0 nova_compute[186176]: 2026-02-16 17:57:00.335 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:57:00 compute-0 nova_compute[186176]: 2026-02-16 17:57:00.335 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:57:00 compute-0 nova_compute[186176]: 2026-02-16 17:57:00.349 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:57:01 compute-0 podman[217527]: 2026-02-16 17:57:01.112116844 +0000 UTC m=+0.077788961 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:57:01 compute-0 podman[217526]: 2026-02-16 17:57:01.156166196 +0000 UTC m=+0.124701343 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 16 17:57:01 compute-0 openstack_network_exporter[198360]: ERROR   17:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:57:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:57:01 compute-0 openstack_network_exporter[198360]: ERROR   17:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:57:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:57:01 compute-0 nova_compute[186176]: 2026-02-16 17:57:01.520 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:57:02 compute-0 nova_compute[186176]: 2026-02-16 17:57:02.327 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:03 compute-0 nova_compute[186176]: 2026-02-16 17:57:03.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:03 compute-0 nova_compute[186176]: 2026-02-16 17:57:03.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:03 compute-0 nova_compute[186176]: 2026-02-16 17:57:03.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:03 compute-0 nova_compute[186176]: 2026-02-16 17:57:03.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:57:06 compute-0 nova_compute[186176]: 2026-02-16 17:57:06.313 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:06 compute-0 nova_compute[186176]: 2026-02-16 17:57:06.522 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.343 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.344 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.345 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.345 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.508 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.509 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5818MB free_disk=73.22265243530273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.509 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.509 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.573 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.573 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.595 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.618 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.619 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:57:07 compute-0 nova_compute[186176]: 2026-02-16 17:57:07.619 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:57:10 compute-0 nova_compute[186176]: 2026-02-16 17:57:10.619 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:10 compute-0 nova_compute[186176]: 2026-02-16 17:57:10.620 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:11 compute-0 nova_compute[186176]: 2026-02-16 17:57:11.525 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:57:11 compute-0 nova_compute[186176]: 2026-02-16 17:57:11.527 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:57:11 compute-0 nova_compute[186176]: 2026-02-16 17:57:11.527 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:57:11 compute-0 nova_compute[186176]: 2026-02-16 17:57:11.528 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:57:11 compute-0 nova_compute[186176]: 2026-02-16 17:57:11.575 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:57:11 compute-0 nova_compute[186176]: 2026-02-16 17:57:11.576 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:57:12 compute-0 nova_compute[186176]: 2026-02-16 17:57:12.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:15 compute-0 nova_compute[186176]: 2026-02-16 17:57:15.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:15 compute-0 nova_compute[186176]: 2026-02-16 17:57:15.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 17:57:15 compute-0 nova_compute[186176]: 2026-02-16 17:57:15.341 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 17:57:15 compute-0 nova_compute[186176]: 2026-02-16 17:57:15.342 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:16 compute-0 nova_compute[186176]: 2026-02-16 17:57:16.578 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:57:16 compute-0 nova_compute[186176]: 2026-02-16 17:57:16.580 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:57:16 compute-0 nova_compute[186176]: 2026-02-16 17:57:16.581 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:57:16 compute-0 nova_compute[186176]: 2026-02-16 17:57:16.581 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:57:16 compute-0 nova_compute[186176]: 2026-02-16 17:57:16.582 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:57:16 compute-0 nova_compute[186176]: 2026-02-16 17:57:16.584 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:57:17 compute-0 sshd-session[217579]: Accepted publickey for zuul from 192.168.122.10 port 38568 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:57:17 compute-0 systemd-logind[821]: New session 47 of user zuul.
Feb 16 17:57:17 compute-0 systemd[1]: Started Session 47 of User zuul.
Feb 16 17:57:17 compute-0 sshd-session[217579]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:57:17 compute-0 sudo[217583]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 16 17:57:17 compute-0 sudo[217583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:57:21 compute-0 nova_compute[186176]: 2026-02-16 17:57:21.584 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:57:21 compute-0 ovs-vsctl[217753]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 16 17:57:22 compute-0 nova_compute[186176]: 2026-02-16 17:57:22.346 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:57:22 compute-0 nova_compute[186176]: 2026-02-16 17:57:22.346 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 17:57:22 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 217607 (sos)
Feb 16 17:57:22 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 16 17:57:22 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 16 17:57:22 compute-0 podman[217802]: 2026-02-16 17:57:22.551185212 +0000 UTC m=+0.089480343 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, version=9.7, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1770267347, io.buildah.version=1.33.7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z)
Feb 16 17:57:22 compute-0 virtqemud[185389]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 16 17:57:22 compute-0 virtqemud[185389]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 16 17:57:22 compute-0 virtqemud[185389]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 16 17:57:23 compute-0 crontab[218188]: (root) LIST (root)
Feb 16 17:57:25 compute-0 kernel: /proc/cgroups lists only v1 controllers, use cgroup.controllers of root cgroup for v2 info
Feb 16 17:57:25 compute-0 podman[218262]: 2026-02-16 17:57:25.115923382 +0000 UTC m=+0.074079415 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 17:57:25 compute-0 systemd[1]: Starting Hostname Service...
Feb 16 17:57:25 compute-0 systemd[1]: Started Hostname Service.
Feb 16 17:57:26 compute-0 nova_compute[186176]: 2026-02-16 17:57:26.585 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:57:29 compute-0 podman[195505]: time="2026-02-16T17:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:57:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:57:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2172 "" "Go-http-client/1.1"
Feb 16 17:57:31 compute-0 openstack_network_exporter[198360]: ERROR   17:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:57:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:57:31 compute-0 openstack_network_exporter[198360]: ERROR   17:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:57:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:57:31 compute-0 nova_compute[186176]: 2026-02-16 17:57:31.588 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:57:31 compute-0 podman[218904]: 2026-02-16 17:57:31.625695765 +0000 UTC m=+0.073803769 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, container_name=ovn_controller)
Feb 16 17:57:31 compute-0 podman[218911]: 2026-02-16 17:57:31.631523568 +0000 UTC m=+0.078864023 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:57:33 compute-0 ovs-appctl[219443]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 16 17:57:33 compute-0 ovs-appctl[219447]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 16 17:57:33 compute-0 ovs-appctl[219452]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 16 17:57:36 compute-0 nova_compute[186176]: 2026-02-16 17:57:36.589 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:57:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:57:38.198 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:57:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:57:38.202 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:57:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:57:38.202 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:57:39 compute-0 virtqemud[185389]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 16 17:57:40 compute-0 systemd[1]: Starting Time & Date Service...
Feb 16 17:57:41 compute-0 systemd[1]: Started Time & Date Service.
Feb 16 17:57:41 compute-0 nova_compute[186176]: 2026-02-16 17:57:41.592 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:57:41 compute-0 nova_compute[186176]: 2026-02-16 17:57:41.595 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:57:41 compute-0 nova_compute[186176]: 2026-02-16 17:57:41.595 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:57:41 compute-0 nova_compute[186176]: 2026-02-16 17:57:41.595 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:57:41 compute-0 nova_compute[186176]: 2026-02-16 17:57:41.596 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:57:41 compute-0 nova_compute[186176]: 2026-02-16 17:57:41.597 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:57:46 compute-0 nova_compute[186176]: 2026-02-16 17:57:46.598 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:57:51 compute-0 nova_compute[186176]: 2026-02-16 17:57:51.601 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:57:51 compute-0 nova_compute[186176]: 2026-02-16 17:57:51.610 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:57:53 compute-0 podman[220945]: 2026-02-16 17:57:53.131373813 +0000 UTC m=+0.091401319 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc.)
Feb 16 17:57:55 compute-0 podman[220967]: 2026-02-16 17:57:55.210256794 +0000 UTC m=+0.064748076 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 17:57:56 compute-0 nova_compute[186176]: 2026-02-16 17:57:56.607 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:57:56 compute-0 nova_compute[186176]: 2026-02-16 17:57:56.612 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:57:59 compute-0 podman[195505]: time="2026-02-16T17:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:57:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:57:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2176 "" "Go-http-client/1.1"
Feb 16 17:58:00 compute-0 sudo[217583]: pam_unix(sudo:session): session closed for user root
Feb 16 17:58:00 compute-0 sshd-session[217582]: Received disconnect from 192.168.122.10 port 38568:11: disconnected by user
Feb 16 17:58:00 compute-0 sshd-session[217582]: Disconnected from user zuul 192.168.122.10 port 38568
Feb 16 17:58:00 compute-0 sshd-session[217579]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:58:00 compute-0 systemd[1]: session-47.scope: Deactivated successfully.
Feb 16 17:58:00 compute-0 systemd[1]: session-47.scope: Consumed 1min 12.771s CPU time, 629.9M memory peak, read 246.4M from disk, written 26.0M to disk.
Feb 16 17:58:00 compute-0 systemd-logind[821]: Session 47 logged out. Waiting for processes to exit.
Feb 16 17:58:00 compute-0 systemd-logind[821]: Removed session 47.
Feb 16 17:58:00 compute-0 sshd-session[220988]: Accepted publickey for zuul from 192.168.122.10 port 52284 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:58:00 compute-0 systemd-logind[821]: New session 48 of user zuul.
Feb 16 17:58:00 compute-0 systemd[1]: Started Session 48 of User zuul.
Feb 16 17:58:00 compute-0 sshd-session[220988]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:58:00 compute-0 sudo[220992]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2026-02-16-nzaxztq.tar.xz
Feb 16 17:58:00 compute-0 sudo[220992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:58:00 compute-0 nova_compute[186176]: 2026-02-16 17:58:00.333 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:58:00 compute-0 nova_compute[186176]: 2026-02-16 17:58:00.334 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:58:00 compute-0 nova_compute[186176]: 2026-02-16 17:58:00.334 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:58:00 compute-0 nova_compute[186176]: 2026-02-16 17:58:00.362 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:58:00 compute-0 sudo[220992]: pam_unix(sudo:session): session closed for user root
Feb 16 17:58:00 compute-0 sshd-session[220991]: Received disconnect from 192.168.122.10 port 52284:11: disconnected by user
Feb 16 17:58:00 compute-0 sshd-session[220991]: Disconnected from user zuul 192.168.122.10 port 52284
Feb 16 17:58:00 compute-0 sshd-session[220988]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:58:00 compute-0 systemd[1]: session-48.scope: Deactivated successfully.
Feb 16 17:58:00 compute-0 systemd-logind[821]: Session 48 logged out. Waiting for processes to exit.
Feb 16 17:58:00 compute-0 systemd-logind[821]: Removed session 48.
Feb 16 17:58:00 compute-0 sshd-session[221017]: Accepted publickey for zuul from 192.168.122.10 port 52288 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 17:58:00 compute-0 systemd-logind[821]: New session 49 of user zuul.
Feb 16 17:58:00 compute-0 systemd[1]: Started Session 49 of User zuul.
Feb 16 17:58:00 compute-0 sshd-session[221017]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 17:58:00 compute-0 sudo[221021]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Feb 16 17:58:00 compute-0 sudo[221021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 17:58:00 compute-0 sudo[221021]: pam_unix(sudo:session): session closed for user root
Feb 16 17:58:00 compute-0 sshd-session[221020]: Received disconnect from 192.168.122.10 port 52288:11: disconnected by user
Feb 16 17:58:00 compute-0 sshd-session[221020]: Disconnected from user zuul 192.168.122.10 port 52288
Feb 16 17:58:00 compute-0 sshd-session[221017]: pam_unix(sshd:session): session closed for user zuul
Feb 16 17:58:00 compute-0 systemd[1]: session-49.scope: Deactivated successfully.
Feb 16 17:58:00 compute-0 systemd-logind[821]: Session 49 logged out. Waiting for processes to exit.
Feb 16 17:58:00 compute-0 systemd-logind[821]: Removed session 49.
Feb 16 17:58:01 compute-0 openstack_network_exporter[198360]: ERROR   17:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:58:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:58:01 compute-0 openstack_network_exporter[198360]: ERROR   17:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:58:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:58:01 compute-0 nova_compute[186176]: 2026-02-16 17:58:01.614 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:58:01 compute-0 nova_compute[186176]: 2026-02-16 17:58:01.616 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:58:01 compute-0 nova_compute[186176]: 2026-02-16 17:58:01.616 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:58:01 compute-0 nova_compute[186176]: 2026-02-16 17:58:01.616 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:58:01 compute-0 nova_compute[186176]: 2026-02-16 17:58:01.857 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:01 compute-0 nova_compute[186176]: 2026-02-16 17:58:01.857 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:58:02 compute-0 podman[221047]: 2026-02-16 17:58:02.077836939 +0000 UTC m=+0.052003534 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 17:58:02 compute-0 podman[221046]: 2026-02-16 17:58:02.127946567 +0000 UTC m=+0.102652785 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 16 17:58:05 compute-0 nova_compute[186176]: 2026-02-16 17:58:05.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:58:05 compute-0 nova_compute[186176]: 2026-02-16 17:58:05.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:58:05 compute-0 nova_compute[186176]: 2026-02-16 17:58:05.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:58:05 compute-0 nova_compute[186176]: 2026-02-16 17:58:05.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:58:06 compute-0 nova_compute[186176]: 2026-02-16 17:58:06.858 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:58:06 compute-0 nova_compute[186176]: 2026-02-16 17:58:06.859 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:06 compute-0 nova_compute[186176]: 2026-02-16 17:58:06.859 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:58:06 compute-0 nova_compute[186176]: 2026-02-16 17:58:06.860 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:58:06 compute-0 nova_compute[186176]: 2026-02-16 17:58:06.861 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:58:06 compute-0 nova_compute[186176]: 2026-02-16 17:58:06.862 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.313 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.353 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.354 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.355 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.355 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.524 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.526 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5501MB free_disk=73.22201156616211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.526 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.527 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.657 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.658 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.686 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.711 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.714 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:58:07 compute-0 nova_compute[186176]: 2026-02-16 17:58:07.715 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:58:11 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 16 17:58:11 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 16 17:58:11 compute-0 nova_compute[186176]: 2026-02-16 17:58:11.862 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:12 compute-0 nova_compute[186176]: 2026-02-16 17:58:12.716 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:58:12 compute-0 nova_compute[186176]: 2026-02-16 17:58:12.717 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:58:13 compute-0 nova_compute[186176]: 2026-02-16 17:58:13.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:58:16 compute-0 nova_compute[186176]: 2026-02-16 17:58:16.862 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:21 compute-0 nova_compute[186176]: 2026-02-16 17:58:21.865 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:24 compute-0 podman[221103]: 2026-02-16 17:58:24.099593889 +0000 UTC m=+0.070424026 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1770267347, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=)
Feb 16 17:58:26 compute-0 podman[221124]: 2026-02-16 17:58:26.117890077 +0000 UTC m=+0.077760535 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:58:26 compute-0 nova_compute[186176]: 2026-02-16 17:58:26.866 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:29 compute-0 podman[195505]: time="2026-02-16T17:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:58:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:58:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 17:58:31 compute-0 openstack_network_exporter[198360]: ERROR   17:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:58:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:58:31 compute-0 openstack_network_exporter[198360]: ERROR   17:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:58:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:58:31 compute-0 nova_compute[186176]: 2026-02-16 17:58:31.868 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:31 compute-0 nova_compute[186176]: 2026-02-16 17:58:31.871 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:33 compute-0 podman[221144]: 2026-02-16 17:58:33.08788329 +0000 UTC m=+0.055859179 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:58:33 compute-0 podman[221143]: 2026-02-16 17:58:33.14994869 +0000 UTC m=+0.120691097 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 16 17:58:36 compute-0 nova_compute[186176]: 2026-02-16 17:58:36.871 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:58:38.199 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:58:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:58:38.200 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:58:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:58:38.200 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:58:41 compute-0 nova_compute[186176]: 2026-02-16 17:58:41.873 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:46 compute-0 nova_compute[186176]: 2026-02-16 17:58:46.875 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:51 compute-0 nova_compute[186176]: 2026-02-16 17:58:51.877 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:55 compute-0 podman[221191]: 2026-02-16 17:58:55.122136784 +0000 UTC m=+0.090231101 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, distribution-scope=public, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Feb 16 17:58:56 compute-0 nova_compute[186176]: 2026-02-16 17:58:56.879 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:58:56 compute-0 nova_compute[186176]: 2026-02-16 17:58:56.880 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:56 compute-0 nova_compute[186176]: 2026-02-16 17:58:56.880 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:58:56 compute-0 nova_compute[186176]: 2026-02-16 17:58:56.881 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:58:56 compute-0 nova_compute[186176]: 2026-02-16 17:58:56.881 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:58:56 compute-0 nova_compute[186176]: 2026-02-16 17:58:56.883 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:58:57 compute-0 podman[221212]: 2026-02-16 17:58:57.093848659 +0000 UTC m=+0.064979742 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 16 17:58:59 compute-0 podman[195505]: time="2026-02-16T17:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:58:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:58:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 17:59:00 compute-0 sshd-session[221231]: Invalid user sol from 2.57.122.210 port 38244
Feb 16 17:59:00 compute-0 sshd-session[221231]: Connection closed by invalid user sol 2.57.122.210 port 38244 [preauth]
Feb 16 17:59:01 compute-0 openstack_network_exporter[198360]: ERROR   17:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:59:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:59:01 compute-0 openstack_network_exporter[198360]: ERROR   17:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:59:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:59:01 compute-0 nova_compute[186176]: 2026-02-16 17:59:01.882 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:59:02 compute-0 nova_compute[186176]: 2026-02-16 17:59:02.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:59:02 compute-0 nova_compute[186176]: 2026-02-16 17:59:02.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 17:59:02 compute-0 nova_compute[186176]: 2026-02-16 17:59:02.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 17:59:02 compute-0 nova_compute[186176]: 2026-02-16 17:59:02.349 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 17:59:03 compute-0 nova_compute[186176]: 2026-02-16 17:59:03.343 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:59:04 compute-0 podman[221234]: 2026-02-16 17:59:04.125410731 +0000 UTC m=+0.079062328 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 17:59:04 compute-0 podman[221233]: 2026-02-16 17:59:04.157035105 +0000 UTC m=+0.117206951 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:59:05 compute-0 nova_compute[186176]: 2026-02-16 17:59:05.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:59:06 compute-0 nova_compute[186176]: 2026-02-16 17:59:06.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:59:06 compute-0 nova_compute[186176]: 2026-02-16 17:59:06.885 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.311 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.316 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.411 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.411 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.412 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.412 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.589 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.590 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5692MB free_disk=73.22227478027344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.590 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.590 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.855 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.856 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.903 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.953 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.955 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 17:59:07 compute-0 nova_compute[186176]: 2026-02-16 17:59:07.956 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:59:11 compute-0 nova_compute[186176]: 2026-02-16 17:59:11.887 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:59:12 compute-0 nova_compute[186176]: 2026-02-16 17:59:12.957 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:59:13 compute-0 nova_compute[186176]: 2026-02-16 17:59:13.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:59:14 compute-0 nova_compute[186176]: 2026-02-16 17:59:14.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 17:59:16 compute-0 nova_compute[186176]: 2026-02-16 17:59:16.888 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:59:21 compute-0 nova_compute[186176]: 2026-02-16 17:59:21.890 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:59:26 compute-0 podman[221284]: 2026-02-16 17:59:26.101174 +0000 UTC m=+0.074743182 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 16 17:59:26 compute-0 nova_compute[186176]: 2026-02-16 17:59:26.893 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:59:28 compute-0 podman[221305]: 2026-02-16 17:59:28.107120804 +0000 UTC m=+0.079805425 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 16 17:59:29 compute-0 podman[195505]: time="2026-02-16T17:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:59:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:59:29 compute-0 podman[195505]: @ - - [16/Feb/2026:17:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 17:59:31 compute-0 openstack_network_exporter[198360]: ERROR   17:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 17:59:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:59:31 compute-0 openstack_network_exporter[198360]: ERROR   17:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 17:59:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 17:59:31 compute-0 nova_compute[186176]: 2026-02-16 17:59:31.894 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:59:35 compute-0 podman[221325]: 2026-02-16 17:59:35.115291732 +0000 UTC m=+0.075164722 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 17:59:35 compute-0 podman[221324]: 2026-02-16 17:59:35.175932877 +0000 UTC m=+0.142120671 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 17:59:36 compute-0 nova_compute[186176]: 2026-02-16 17:59:36.896 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:59:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:59:38.200 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 17:59:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:59:38.200 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 17:59:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 17:59:38.200 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 17:59:41 compute-0 nova_compute[186176]: 2026-02-16 17:59:41.898 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:59:46 compute-0 nova_compute[186176]: 2026-02-16 17:59:46.900 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:59:46 compute-0 nova_compute[186176]: 2026-02-16 17:59:46.902 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:59:46 compute-0 nova_compute[186176]: 2026-02-16 17:59:46.902 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:59:46 compute-0 nova_compute[186176]: 2026-02-16 17:59:46.902 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:59:46 compute-0 nova_compute[186176]: 2026-02-16 17:59:46.903 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:59:46 compute-0 nova_compute[186176]: 2026-02-16 17:59:46.904 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:59:51 compute-0 nova_compute[186176]: 2026-02-16 17:59:51.904 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:59:56 compute-0 nova_compute[186176]: 2026-02-16 17:59:56.906 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 17:59:56 compute-0 nova_compute[186176]: 2026-02-16 17:59:56.906 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:59:56 compute-0 nova_compute[186176]: 2026-02-16 17:59:56.907 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 17:59:56 compute-0 nova_compute[186176]: 2026-02-16 17:59:56.907 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:59:56 compute-0 nova_compute[186176]: 2026-02-16 17:59:56.907 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 17:59:56 compute-0 nova_compute[186176]: 2026-02-16 17:59:56.909 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 17:59:57 compute-0 podman[221372]: 2026-02-16 17:59:57.115133931 +0000 UTC m=+0.081552778 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., release=1770267347, architecture=x86_64, build-date=2026-02-05T04:57:10Z, distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Feb 16 17:59:59 compute-0 podman[221395]: 2026-02-16 17:59:59.116248219 +0000 UTC m=+0.076678949 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 16 17:59:59 compute-0 podman[195505]: time="2026-02-16T17:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 17:59:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 17:59:59 compute-0 podman[195505]: @ - - [16/Feb/2026:17:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Feb 16 18:00:01 compute-0 openstack_network_exporter[198360]: ERROR   18:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:00:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:00:01 compute-0 openstack_network_exporter[198360]: ERROR   18:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:00:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:00:01 compute-0 nova_compute[186176]: 2026-02-16 18:00:01.909 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:02 compute-0 nova_compute[186176]: 2026-02-16 18:00:02.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:00:02 compute-0 nova_compute[186176]: 2026-02-16 18:00:02.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 18:00:02 compute-0 nova_compute[186176]: 2026-02-16 18:00:02.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 18:00:02 compute-0 nova_compute[186176]: 2026-02-16 18:00:02.340 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 18:00:05 compute-0 nova_compute[186176]: 2026-02-16 18:00:05.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:00:06 compute-0 podman[221416]: 2026-02-16 18:00:06.094991886 +0000 UTC m=+0.062585354 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 18:00:06 compute-0 podman[221415]: 2026-02-16 18:00:06.178775338 +0000 UTC m=+0.148979940 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 16 18:00:06 compute-0 nova_compute[186176]: 2026-02-16 18:00:06.910 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.347 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.347 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.347 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.347 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.546 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.547 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5734MB free_disk=73.22225570678711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.548 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.548 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.646 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.646 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.667 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing inventories for resource provider bb904aac-529f-46ef-9861-9c655a4b383c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.728 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating ProviderTree inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.728 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.744 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing aggregate associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.771 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing trait associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.793 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.810 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.812 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 18:00:07 compute-0 nova_compute[186176]: 2026-02-16 18:00:07.812 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:00:08 compute-0 nova_compute[186176]: 2026-02-16 18:00:08.806 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:00:08 compute-0 nova_compute[186176]: 2026-02-16 18:00:08.807 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:00:08 compute-0 nova_compute[186176]: 2026-02-16 18:00:08.807 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:00:08 compute-0 nova_compute[186176]: 2026-02-16 18:00:08.807 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 18:00:11 compute-0 nova_compute[186176]: 2026-02-16 18:00:11.912 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:00:11 compute-0 nova_compute[186176]: 2026-02-16 18:00:11.914 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:11 compute-0 nova_compute[186176]: 2026-02-16 18:00:11.914 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:00:11 compute-0 nova_compute[186176]: 2026-02-16 18:00:11.915 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:00:11 compute-0 nova_compute[186176]: 2026-02-16 18:00:11.915 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:00:11 compute-0 nova_compute[186176]: 2026-02-16 18:00:11.917 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:12 compute-0 nova_compute[186176]: 2026-02-16 18:00:12.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:00:13 compute-0 nova_compute[186176]: 2026-02-16 18:00:13.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:00:14 compute-0 nova_compute[186176]: 2026-02-16 18:00:14.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:00:16 compute-0 nova_compute[186176]: 2026-02-16 18:00:16.915 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:16 compute-0 nova_compute[186176]: 2026-02-16 18:00:16.917 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:21 compute-0 nova_compute[186176]: 2026-02-16 18:00:21.917 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:26 compute-0 nova_compute[186176]: 2026-02-16 18:00:26.919 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:00:28 compute-0 podman[221465]: 2026-02-16 18:00:28.094726845 +0000 UTC m=+0.063041615 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=)
Feb 16 18:00:29 compute-0 podman[195505]: time="2026-02-16T18:00:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:00:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:00:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:00:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:00:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2177 "" "Go-http-client/1.1"
Feb 16 18:00:30 compute-0 podman[221487]: 2026-02-16 18:00:30.083356476 +0000 UTC m=+0.051040481 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 16 18:00:31 compute-0 openstack_network_exporter[198360]: ERROR   18:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:00:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:00:31 compute-0 openstack_network_exporter[198360]: ERROR   18:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:00:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:00:31 compute-0 nova_compute[186176]: 2026-02-16 18:00:31.921 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:00:31 compute-0 nova_compute[186176]: 2026-02-16 18:00:31.922 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:36 compute-0 nova_compute[186176]: 2026-02-16 18:00:36.923 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:00:36 compute-0 nova_compute[186176]: 2026-02-16 18:00:36.925 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:00:36 compute-0 nova_compute[186176]: 2026-02-16 18:00:36.926 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:00:36 compute-0 nova_compute[186176]: 2026-02-16 18:00:36.926 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:00:36 compute-0 nova_compute[186176]: 2026-02-16 18:00:36.959 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:36 compute-0 nova_compute[186176]: 2026-02-16 18:00:36.960 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:00:37 compute-0 podman[221508]: 2026-02-16 18:00:37.103115978 +0000 UTC m=+0.069594106 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 18:00:37 compute-0 podman[221507]: 2026-02-16 18:00:37.124954373 +0000 UTC m=+0.094235199 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 16 18:00:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:00:38.202 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:00:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:00:38.202 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:00:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:00:38.202 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:00:41 compute-0 nova_compute[186176]: 2026-02-16 18:00:41.960 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:46 compute-0 nova_compute[186176]: 2026-02-16 18:00:46.962 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:51 compute-0 nova_compute[186176]: 2026-02-16 18:00:51.964 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:56 compute-0 nova_compute[186176]: 2026-02-16 18:00:56.967 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:00:56 compute-0 nova_compute[186176]: 2026-02-16 18:00:56.969 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:00:56 compute-0 nova_compute[186176]: 2026-02-16 18:00:56.970 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:00:56 compute-0 nova_compute[186176]: 2026-02-16 18:00:56.970 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:00:57 compute-0 nova_compute[186176]: 2026-02-16 18:00:57.016 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:00:57 compute-0 nova_compute[186176]: 2026-02-16 18:00:57.017 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:00:59 compute-0 podman[221557]: 2026-02-16 18:00:59.111346443 +0000 UTC m=+0.078394201 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, version=9.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64)
Feb 16 18:00:59 compute-0 podman[195505]: time="2026-02-16T18:00:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:00:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:00:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:00:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:00:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 18:01:01 compute-0 podman[221578]: 2026-02-16 18:01:01.093368912 +0000 UTC m=+0.065736031 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 16 18:01:01 compute-0 openstack_network_exporter[198360]: ERROR   18:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:01:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:01:01 compute-0 openstack_network_exporter[198360]: ERROR   18:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:01:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:01:01 compute-0 CROND[221599]: (root) CMD (run-parts /etc/cron.hourly)
Feb 16 18:01:01 compute-0 run-parts[221602]: (/etc/cron.hourly) starting 0anacron
Feb 16 18:01:01 compute-0 run-parts[221608]: (/etc/cron.hourly) finished 0anacron
Feb 16 18:01:01 compute-0 CROND[221598]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 16 18:01:02 compute-0 nova_compute[186176]: 2026-02-16 18:01:02.018 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:02 compute-0 nova_compute[186176]: 2026-02-16 18:01:02.020 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:02 compute-0 nova_compute[186176]: 2026-02-16 18:01:02.020 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:01:02 compute-0 nova_compute[186176]: 2026-02-16 18:01:02.020 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:01:02 compute-0 nova_compute[186176]: 2026-02-16 18:01:02.064 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:01:02 compute-0 nova_compute[186176]: 2026-02-16 18:01:02.065 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:01:04 compute-0 nova_compute[186176]: 2026-02-16 18:01:04.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:01:04 compute-0 nova_compute[186176]: 2026-02-16 18:01:04.327 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:01:04 compute-0 nova_compute[186176]: 2026-02-16 18:01:04.328 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 18:01:04 compute-0 nova_compute[186176]: 2026-02-16 18:01:04.328 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 18:01:04 compute-0 nova_compute[186176]: 2026-02-16 18:01:04.340 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 18:01:06 compute-0 nova_compute[186176]: 2026-02-16 18:01:06.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.066 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.068 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.068 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.069 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.113 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.114 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.486 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.487 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.487 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.488 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.619 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.620 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5783MB free_disk=73.22246551513672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.621 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.621 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.677 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.677 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.695 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.708 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.709 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 18:01:07 compute-0 nova_compute[186176]: 2026-02-16 18:01:07.710 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:01:08 compute-0 podman[221610]: 2026-02-16 18:01:08.078101664 +0000 UTC m=+0.048371116 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 16 18:01:08 compute-0 podman[221609]: 2026-02-16 18:01:08.100760379 +0000 UTC m=+0.074393283 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, 
container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 16 18:01:08 compute-0 nova_compute[186176]: 2026-02-16 18:01:08.710 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:01:08 compute-0 nova_compute[186176]: 2026-02-16 18:01:08.711 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:01:08 compute-0 nova_compute[186176]: 2026-02-16 18:01:08.711 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 18:01:09 compute-0 nova_compute[186176]: 2026-02-16 18:01:09.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:01:12 compute-0 nova_compute[186176]: 2026-02-16 18:01:12.115 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:01:14 compute-0 nova_compute[186176]: 2026-02-16 18:01:14.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:01:14 compute-0 nova_compute[186176]: 2026-02-16 18:01:14.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:01:14 compute-0 sshd-session[221656]: Invalid user ubuntu from 2.57.122.210 port 40862
Feb 16 18:01:14 compute-0 sshd-session[221656]: Connection closed by invalid user ubuntu 2.57.122.210 port 40862 [preauth]
Feb 16 18:01:15 compute-0 nova_compute[186176]: 2026-02-16 18:01:15.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:01:17 compute-0 nova_compute[186176]: 2026-02-16 18:01:17.116 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:01:22 compute-0 nova_compute[186176]: 2026-02-16 18:01:22.118 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:22 compute-0 nova_compute[186176]: 2026-02-16 18:01:22.160 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:22 compute-0 nova_compute[186176]: 2026-02-16 18:01:22.160 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5042 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:01:22 compute-0 nova_compute[186176]: 2026-02-16 18:01:22.161 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:01:22 compute-0 nova_compute[186176]: 2026-02-16 18:01:22.161 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:01:22 compute-0 nova_compute[186176]: 2026-02-16 18:01:22.163 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:01:27 compute-0 nova_compute[186176]: 2026-02-16 18:01:27.163 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:01:29 compute-0 podman[195505]: time="2026-02-16T18:01:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:01:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:01:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:01:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:01:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Feb 16 18:01:29 compute-0 podman[221658]: 2026-02-16 18:01:29.879988062 +0000 UTC m=+0.102246322 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1770267347, vcs-type=git, version=9.7, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc.)
Feb 16 18:01:31 compute-0 openstack_network_exporter[198360]: ERROR   18:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:01:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:01:31 compute-0 openstack_network_exporter[198360]: ERROR   18:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:01:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:01:32 compute-0 podman[221680]: 2026-02-16 18:01:32.093177927 +0000 UTC m=+0.063915100 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 16 18:01:32 compute-0 nova_compute[186176]: 2026-02-16 18:01:32.165 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:01:37 compute-0 nova_compute[186176]: 2026-02-16 18:01:37.167 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:37 compute-0 nova_compute[186176]: 2026-02-16 18:01:37.207 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:37 compute-0 nova_compute[186176]: 2026-02-16 18:01:37.208 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:01:37 compute-0 nova_compute[186176]: 2026-02-16 18:01:37.208 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:01:37 compute-0 nova_compute[186176]: 2026-02-16 18:01:37.209 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:01:37 compute-0 nova_compute[186176]: 2026-02-16 18:01:37.209 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:01:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:01:38.203 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:01:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:01:38.204 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:01:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:01:38.204 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:01:39 compute-0 podman[221700]: 2026-02-16 18:01:39.121442863 +0000 UTC m=+0.092304667 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 16 18:01:39 compute-0 podman[221701]: 2026-02-16 18:01:39.129730987 +0000 UTC m=+0.092905342 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 18:01:42 compute-0 nova_compute[186176]: 2026-02-16 18:01:42.210 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:42 compute-0 nova_compute[186176]: 2026-02-16 18:01:42.211 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:01:47 compute-0 nova_compute[186176]: 2026-02-16 18:01:47.213 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:52 compute-0 nova_compute[186176]: 2026-02-16 18:01:52.215 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:52 compute-0 nova_compute[186176]: 2026-02-16 18:01:52.217 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:52 compute-0 nova_compute[186176]: 2026-02-16 18:01:52.217 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:01:52 compute-0 nova_compute[186176]: 2026-02-16 18:01:52.217 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:01:52 compute-0 nova_compute[186176]: 2026-02-16 18:01:52.267 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:01:52 compute-0 nova_compute[186176]: 2026-02-16 18:01:52.268 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:01:56 compute-0 nova_compute[186176]: 2026-02-16 18:01:56.996 186180 DEBUG oslo_concurrency.processutils [None req-a613f7e7-5312-411a-b4f9-c2c3fff52691 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 16 18:01:57 compute-0 nova_compute[186176]: 2026-02-16 18:01:57.011 186180 DEBUG oslo_concurrency.processutils [None req-a613f7e7-5312-411a-b4f9-c2c3fff52691 4997dcc8f79845c495ee8716f72d604c 1153d82e3c954635916cdffc75cdb267 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 16 18:01:57 compute-0 nova_compute[186176]: 2026-02-16 18:01:57.304 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:01:59 compute-0 podman[195505]: time="2026-02-16T18:01:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:01:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:01:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:01:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:01:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Feb 16 18:02:00 compute-0 podman[221751]: 2026-02-16 18:02:00.099393478 +0000 UTC m=+0.062099416 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.7, release=1770267347, architecture=x86_64, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 16 18:02:01 compute-0 openstack_network_exporter[198360]: ERROR   18:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:02:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:02:01 compute-0 openstack_network_exporter[198360]: ERROR   18:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:02:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:02:02 compute-0 nova_compute[186176]: 2026-02-16 18:02:02.307 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:02 compute-0 nova_compute[186176]: 2026-02-16 18:02:02.309 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:02 compute-0 nova_compute[186176]: 2026-02-16 18:02:02.309 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:02:02 compute-0 nova_compute[186176]: 2026-02-16 18:02:02.309 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:02 compute-0 nova_compute[186176]: 2026-02-16 18:02:02.345 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:02 compute-0 nova_compute[186176]: 2026-02-16 18:02:02.345 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:02 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:02:02.826 105730 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'be:71:23', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3e:85:2d:ea:59:27'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 16 18:02:02 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:02:02.827 105730 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 16 18:02:02 compute-0 nova_compute[186176]: 2026-02-16 18:02:02.827 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:03 compute-0 podman[221772]: 2026-02-16 18:02:03.101812465 +0000 UTC m=+0.071978959 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 18:02:03 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:02:03.830 105730 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=09f26141-c730-49d9-ad1c-7063ea4246fa, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 16 18:02:06 compute-0 nova_compute[186176]: 2026-02-16 18:02:06.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:06 compute-0 nova_compute[186176]: 2026-02-16 18:02:06.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 18:02:06 compute-0 nova_compute[186176]: 2026-02-16 18:02:06.319 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 18:02:06 compute-0 nova_compute[186176]: 2026-02-16 18:02:06.335 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.337 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.338 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.338 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.338 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.408 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.535 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.536 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5786MB free_disk=73.22223663330078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.536 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.536 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.596 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.596 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.614 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.626 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.627 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 18:02:07 compute-0 nova_compute[186176]: 2026-02-16 18:02:07.627 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:02:10 compute-0 podman[221794]: 2026-02-16 18:02:10.096337962 +0000 UTC m=+0.060369234 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 18:02:10 compute-0 podman[221793]: 2026-02-16 18:02:10.107584598 +0000 UTC m=+0.077879344 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 16 18:02:10 compute-0 nova_compute[186176]: 2026-02-16 18:02:10.629 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:10 compute-0 nova_compute[186176]: 2026-02-16 18:02:10.630 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:10 compute-0 nova_compute[186176]: 2026-02-16 18:02:10.630 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 18:02:11 compute-0 nova_compute[186176]: 2026-02-16 18:02:11.313 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:12 compute-0 nova_compute[186176]: 2026-02-16 18:02:12.412 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:14 compute-0 nova_compute[186176]: 2026-02-16 18:02:14.319 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:16 compute-0 nova_compute[186176]: 2026-02-16 18:02:16.319 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:17 compute-0 nova_compute[186176]: 2026-02-16 18:02:17.319 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:17 compute-0 nova_compute[186176]: 2026-02-16 18:02:17.414 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:21 compute-0 nova_compute[186176]: 2026-02-16 18:02:21.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:21 compute-0 nova_compute[186176]: 2026-02-16 18:02:21.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 18:02:21 compute-0 nova_compute[186176]: 2026-02-16 18:02:21.338 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 18:02:22 compute-0 nova_compute[186176]: 2026-02-16 18:02:22.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:22 compute-0 nova_compute[186176]: 2026-02-16 18:02:22.417 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:22 compute-0 nova_compute[186176]: 2026-02-16 18:02:22.418 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:22 compute-0 nova_compute[186176]: 2026-02-16 18:02:22.419 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:02:22 compute-0 nova_compute[186176]: 2026-02-16 18:02:22.419 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:22 compute-0 nova_compute[186176]: 2026-02-16 18:02:22.455 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:22 compute-0 nova_compute[186176]: 2026-02-16 18:02:22.456 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:27 compute-0 nova_compute[186176]: 2026-02-16 18:02:27.457 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:29 compute-0 podman[195505]: time="2026-02-16T18:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:02:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:02:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:02:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Feb 16 18:02:31 compute-0 podman[221843]: 2026-02-16 18:02:31.112478675 +0000 UTC m=+0.074990062 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, version=9.7, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 16 18:02:31 compute-0 openstack_network_exporter[198360]: ERROR   18:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:02:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:02:31 compute-0 openstack_network_exporter[198360]: ERROR   18:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:02:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:02:32 compute-0 nova_compute[186176]: 2026-02-16 18:02:32.459 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:32 compute-0 nova_compute[186176]: 2026-02-16 18:02:32.461 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:33 compute-0 nova_compute[186176]: 2026-02-16 18:02:33.328 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:02:33 compute-0 nova_compute[186176]: 2026-02-16 18:02:33.328 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 16 18:02:34 compute-0 podman[221865]: 2026-02-16 18:02:34.10181641 +0000 UTC m=+0.073316621 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 16 18:02:37 compute-0 nova_compute[186176]: 2026-02-16 18:02:37.462 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:02:38.204 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:02:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:02:38.205 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:02:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:02:38.205 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:02:41 compute-0 podman[221885]: 2026-02-16 18:02:41.081099534 +0000 UTC m=+0.050596683 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 18:02:41 compute-0 podman[221884]: 2026-02-16 18:02:41.105670378 +0000 UTC m=+0.080361735 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 16 18:02:42 compute-0 nova_compute[186176]: 2026-02-16 18:02:42.464 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:42 compute-0 nova_compute[186176]: 2026-02-16 18:02:42.465 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:42 compute-0 nova_compute[186176]: 2026-02-16 18:02:42.465 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:02:42 compute-0 nova_compute[186176]: 2026-02-16 18:02:42.466 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:42 compute-0 nova_compute[186176]: 2026-02-16 18:02:42.466 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:42 compute-0 nova_compute[186176]: 2026-02-16 18:02:42.467 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:47 compute-0 nova_compute[186176]: 2026-02-16 18:02:47.468 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:47 compute-0 nova_compute[186176]: 2026-02-16 18:02:47.470 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:47 compute-0 nova_compute[186176]: 2026-02-16 18:02:47.471 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:02:47 compute-0 nova_compute[186176]: 2026-02-16 18:02:47.471 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:47 compute-0 nova_compute[186176]: 2026-02-16 18:02:47.516 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:47 compute-0 nova_compute[186176]: 2026-02-16 18:02:47.517 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:52 compute-0 nova_compute[186176]: 2026-02-16 18:02:52.518 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:52 compute-0 nova_compute[186176]: 2026-02-16 18:02:52.520 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:52 compute-0 nova_compute[186176]: 2026-02-16 18:02:52.520 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:02:52 compute-0 nova_compute[186176]: 2026-02-16 18:02:52.521 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:52 compute-0 nova_compute[186176]: 2026-02-16 18:02:52.530 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:52 compute-0 nova_compute[186176]: 2026-02-16 18:02:52.531 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:57 compute-0 nova_compute[186176]: 2026-02-16 18:02:57.532 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:57 compute-0 nova_compute[186176]: 2026-02-16 18:02:57.534 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:02:57 compute-0 nova_compute[186176]: 2026-02-16 18:02:57.534 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:02:57 compute-0 nova_compute[186176]: 2026-02-16 18:02:57.534 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:57 compute-0 nova_compute[186176]: 2026-02-16 18:02:57.561 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:02:57 compute-0 nova_compute[186176]: 2026-02-16 18:02:57.562 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:02:59 compute-0 podman[195505]: time="2026-02-16T18:02:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:02:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:02:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:02:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:02:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 18:03:01 compute-0 openstack_network_exporter[198360]: ERROR   18:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:03:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:03:01 compute-0 openstack_network_exporter[198360]: ERROR   18:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:03:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:03:02 compute-0 podman[221934]: 2026-02-16 18:03:02.095139068 +0000 UTC m=+0.064141366 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Feb 16 18:03:02 compute-0 nova_compute[186176]: 2026-02-16 18:03:02.563 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:02 compute-0 nova_compute[186176]: 2026-02-16 18:03:02.564 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:02 compute-0 nova_compute[186176]: 2026-02-16 18:03:02.564 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:03:02 compute-0 nova_compute[186176]: 2026-02-16 18:03:02.564 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:02 compute-0 nova_compute[186176]: 2026-02-16 18:03:02.565 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:02 compute-0 nova_compute[186176]: 2026-02-16 18:03:02.566 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:05 compute-0 podman[221955]: 2026-02-16 18:03:05.073926553 +0000 UTC m=+0.046923663 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Feb 16 18:03:07 compute-0 nova_compute[186176]: 2026-02-16 18:03:07.567 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:07 compute-0 nova_compute[186176]: 2026-02-16 18:03:07.569 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:07 compute-0 nova_compute[186176]: 2026-02-16 18:03:07.569 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:03:07 compute-0 nova_compute[186176]: 2026-02-16 18:03:07.569 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:07 compute-0 nova_compute[186176]: 2026-02-16 18:03:07.618 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:07 compute-0 nova_compute[186176]: 2026-02-16 18:03:07.619 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.336 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.337 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.337 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.398 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.399 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.433 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.434 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.434 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.434 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.647 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.648 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5796MB free_disk=73.22232437133789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.648 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.648 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.935 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.935 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.961 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.997 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.998 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 18:03:08 compute-0 nova_compute[186176]: 2026-02-16 18:03:08.998 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:03:09 compute-0 nova_compute[186176]: 2026-02-16 18:03:09.917 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:03:09 compute-0 nova_compute[186176]: 2026-02-16 18:03:09.958 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:03:10 compute-0 nova_compute[186176]: 2026-02-16 18:03:10.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:03:10 compute-0 nova_compute[186176]: 2026-02-16 18:03:10.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 18:03:11 compute-0 nova_compute[186176]: 2026-02-16 18:03:11.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:03:12 compute-0 podman[221976]: 2026-02-16 18:03:12.109302674 +0000 UTC m=+0.073172648 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 18:03:12 compute-0 podman[221975]: 2026-02-16 18:03:12.135752143 +0000 UTC m=+0.104205590 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 16 18:03:12 compute-0 nova_compute[186176]: 2026-02-16 18:03:12.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:03:12 compute-0 nova_compute[186176]: 2026-02-16 18:03:12.619 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:12 compute-0 nova_compute[186176]: 2026-02-16 18:03:12.621 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:14 compute-0 nova_compute[186176]: 2026-02-16 18:03:14.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:03:17 compute-0 nova_compute[186176]: 2026-02-16 18:03:17.623 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:17 compute-0 nova_compute[186176]: 2026-02-16 18:03:17.625 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:17 compute-0 nova_compute[186176]: 2026-02-16 18:03:17.625 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:03:17 compute-0 nova_compute[186176]: 2026-02-16 18:03:17.625 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:17 compute-0 nova_compute[186176]: 2026-02-16 18:03:17.653 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:17 compute-0 nova_compute[186176]: 2026-02-16 18:03:17.654 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:18 compute-0 nova_compute[186176]: 2026-02-16 18:03:18.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:03:18 compute-0 nova_compute[186176]: 2026-02-16 18:03:18.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:03:22 compute-0 nova_compute[186176]: 2026-02-16 18:03:22.654 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:23 compute-0 sshd-session[222025]: Invalid user solana from 2.57.122.210 port 43456
Feb 16 18:03:23 compute-0 sshd-session[222025]: Connection closed by invalid user solana 2.57.122.210 port 43456 [preauth]
Feb 16 18:03:27 compute-0 nova_compute[186176]: 2026-02-16 18:03:27.656 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:27 compute-0 nova_compute[186176]: 2026-02-16 18:03:27.658 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:27 compute-0 nova_compute[186176]: 2026-02-16 18:03:27.659 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:03:27 compute-0 nova_compute[186176]: 2026-02-16 18:03:27.659 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:27 compute-0 nova_compute[186176]: 2026-02-16 18:03:27.692 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:27 compute-0 nova_compute[186176]: 2026-02-16 18:03:27.693 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:29 compute-0 podman[195505]: time="2026-02-16T18:03:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:03:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:03:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:03:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:03:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 18:03:31 compute-0 openstack_network_exporter[198360]: ERROR   18:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:03:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:03:31 compute-0 openstack_network_exporter[198360]: ERROR   18:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:03:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:03:32 compute-0 nova_compute[186176]: 2026-02-16 18:03:32.694 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:32 compute-0 nova_compute[186176]: 2026-02-16 18:03:32.696 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:32 compute-0 nova_compute[186176]: 2026-02-16 18:03:32.697 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:03:32 compute-0 nova_compute[186176]: 2026-02-16 18:03:32.697 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:32 compute-0 nova_compute[186176]: 2026-02-16 18:03:32.739 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:32 compute-0 nova_compute[186176]: 2026-02-16 18:03:32.740 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:33 compute-0 podman[222027]: 2026-02-16 18:03:33.134198961 +0000 UTC m=+0.103013800 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1770267347, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7)
Feb 16 18:03:36 compute-0 podman[222048]: 2026-02-16 18:03:36.113001048 +0000 UTC m=+0.077395352 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 16 18:03:37 compute-0 nova_compute[186176]: 2026-02-16 18:03:37.741 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:37 compute-0 nova_compute[186176]: 2026-02-16 18:03:37.743 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:37 compute-0 nova_compute[186176]: 2026-02-16 18:03:37.743 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:03:37 compute-0 nova_compute[186176]: 2026-02-16 18:03:37.743 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:37 compute-0 nova_compute[186176]: 2026-02-16 18:03:37.769 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:37 compute-0 nova_compute[186176]: 2026-02-16 18:03:37.769 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:03:38.205 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:03:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:03:38.206 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:03:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:03:38.206 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:03:42 compute-0 nova_compute[186176]: 2026-02-16 18:03:42.771 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:42 compute-0 nova_compute[186176]: 2026-02-16 18:03:42.773 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:42 compute-0 nova_compute[186176]: 2026-02-16 18:03:42.773 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:03:42 compute-0 nova_compute[186176]: 2026-02-16 18:03:42.773 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:42 compute-0 nova_compute[186176]: 2026-02-16 18:03:42.809 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:42 compute-0 nova_compute[186176]: 2026-02-16 18:03:42.809 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:43 compute-0 podman[222069]: 2026-02-16 18:03:43.118189857 +0000 UTC m=+0.074589343 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 18:03:43 compute-0 podman[222068]: 2026-02-16 18:03:43.146634126 +0000 UTC m=+0.108291351 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 16 18:03:47 compute-0 nova_compute[186176]: 2026-02-16 18:03:47.810 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:47 compute-0 nova_compute[186176]: 2026-02-16 18:03:47.812 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:47 compute-0 nova_compute[186176]: 2026-02-16 18:03:47.812 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:03:47 compute-0 nova_compute[186176]: 2026-02-16 18:03:47.813 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:47 compute-0 nova_compute[186176]: 2026-02-16 18:03:47.831 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:47 compute-0 nova_compute[186176]: 2026-02-16 18:03:47.831 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:52 compute-0 nova_compute[186176]: 2026-02-16 18:03:52.832 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:52 compute-0 nova_compute[186176]: 2026-02-16 18:03:52.834 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:52 compute-0 nova_compute[186176]: 2026-02-16 18:03:52.835 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:03:52 compute-0 nova_compute[186176]: 2026-02-16 18:03:52.835 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:52 compute-0 nova_compute[186176]: 2026-02-16 18:03:52.883 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:52 compute-0 nova_compute[186176]: 2026-02-16 18:03:52.884 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:57 compute-0 nova_compute[186176]: 2026-02-16 18:03:57.885 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:57 compute-0 nova_compute[186176]: 2026-02-16 18:03:57.887 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:03:57 compute-0 nova_compute[186176]: 2026-02-16 18:03:57.887 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:03:57 compute-0 nova_compute[186176]: 2026-02-16 18:03:57.888 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:57 compute-0 nova_compute[186176]: 2026-02-16 18:03:57.925 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:03:57 compute-0 nova_compute[186176]: 2026-02-16 18:03:57.926 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:03:59 compute-0 podman[195505]: time="2026-02-16T18:03:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:03:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:03:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:03:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:03:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 18:04:01 compute-0 openstack_network_exporter[198360]: ERROR   18:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:04:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:04:01 compute-0 openstack_network_exporter[198360]: ERROR   18:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:04:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:04:02 compute-0 nova_compute[186176]: 2026-02-16 18:04:02.927 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:02 compute-0 nova_compute[186176]: 2026-02-16 18:04:02.929 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:02 compute-0 nova_compute[186176]: 2026-02-16 18:04:02.930 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:04:02 compute-0 nova_compute[186176]: 2026-02-16 18:04:02.930 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:04:02 compute-0 nova_compute[186176]: 2026-02-16 18:04:02.952 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:04:02 compute-0 nova_compute[186176]: 2026-02-16 18:04:02.953 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:04:04 compute-0 podman[222119]: 2026-02-16 18:04:04.114141175 +0000 UTC m=+0.082683812 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, release=1770267347, config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.7)
Feb 16 18:04:07 compute-0 podman[222139]: 2026-02-16 18:04:07.090223094 +0000 UTC m=+0.062879795 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 16 18:04:07 compute-0 nova_compute[186176]: 2026-02-16 18:04:07.954 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:08 compute-0 nova_compute[186176]: 2026-02-16 18:04:08.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:04:08 compute-0 nova_compute[186176]: 2026-02-16 18:04:08.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 18:04:08 compute-0 nova_compute[186176]: 2026-02-16 18:04:08.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 18:04:08 compute-0 nova_compute[186176]: 2026-02-16 18:04:08.504 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.339 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.340 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.340 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.341 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.548 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.550 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5814MB free_disk=73.22232437133789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.551 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.551 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.658 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.659 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.823 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.852 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.855 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 18:04:10 compute-0 nova_compute[186176]: 2026-02-16 18:04:10.855 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:04:11 compute-0 nova_compute[186176]: 2026-02-16 18:04:11.857 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:04:12 compute-0 nova_compute[186176]: 2026-02-16 18:04:12.958 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:14 compute-0 podman[222160]: 2026-02-16 18:04:14.116438801 +0000 UTC m=+0.075462994 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 18:04:14 compute-0 podman[222159]: 2026-02-16 18:04:14.177812158 +0000 UTC m=+0.143795252 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 18:04:14 compute-0 nova_compute[186176]: 2026-02-16 18:04:14.311 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:04:14 compute-0 nova_compute[186176]: 2026-02-16 18:04:14.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:04:17 compute-0 nova_compute[186176]: 2026-02-16 18:04:17.961 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:18 compute-0 nova_compute[186176]: 2026-02-16 18:04:18.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:04:20 compute-0 nova_compute[186176]: 2026-02-16 18:04:20.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:04:22 compute-0 nova_compute[186176]: 2026-02-16 18:04:22.963 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:04:27 compute-0 nova_compute[186176]: 2026-02-16 18:04:27.965 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:27 compute-0 nova_compute[186176]: 2026-02-16 18:04:27.966 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:04:27 compute-0 nova_compute[186176]: 2026-02-16 18:04:27.966 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:04:27 compute-0 nova_compute[186176]: 2026-02-16 18:04:27.967 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:04:27 compute-0 nova_compute[186176]: 2026-02-16 18:04:27.967 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:04:27 compute-0 nova_compute[186176]: 2026-02-16 18:04:27.968 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:04:29 compute-0 podman[195505]: time="2026-02-16T18:04:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:04:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:04:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:04:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:04:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 18:04:31 compute-0 openstack_network_exporter[198360]: ERROR   18:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:04:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:04:31 compute-0 openstack_network_exporter[198360]: ERROR   18:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:04:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:04:32 compute-0 nova_compute[186176]: 2026-02-16 18:04:32.969 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:04:35 compute-0 podman[222207]: 2026-02-16 18:04:35.12708761 +0000 UTC m=+0.093535778 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, config_id=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 16 18:04:37 compute-0 nova_compute[186176]: 2026-02-16 18:04:37.971 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:04:38 compute-0 podman[222229]: 2026-02-16 18:04:38.120282148 +0000 UTC m=+0.095642290 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 16 18:04:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:04:38.207 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:04:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:04:38.208 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:04:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:04:38.208 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:04:42 compute-0 nova_compute[186176]: 2026-02-16 18:04:42.973 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:04:45 compute-0 podman[222250]: 2026-02-16 18:04:45.110348627 +0000 UTC m=+0.075600467 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 16 18:04:45 compute-0 podman[222249]: 2026-02-16 18:04:45.148291349 +0000 UTC m=+0.117776233 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 16 18:04:47 compute-0 nova_compute[186176]: 2026-02-16 18:04:47.977 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:47 compute-0 nova_compute[186176]: 2026-02-16 18:04:47.979 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:47 compute-0 nova_compute[186176]: 2026-02-16 18:04:47.980 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:04:47 compute-0 nova_compute[186176]: 2026-02-16 18:04:47.980 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:04:48 compute-0 nova_compute[186176]: 2026-02-16 18:04:48.017 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:04:48 compute-0 nova_compute[186176]: 2026-02-16 18:04:48.018 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:04:53 compute-0 nova_compute[186176]: 2026-02-16 18:04:53.019 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:53 compute-0 nova_compute[186176]: 2026-02-16 18:04:53.021 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:53 compute-0 nova_compute[186176]: 2026-02-16 18:04:53.021 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:04:53 compute-0 nova_compute[186176]: 2026-02-16 18:04:53.021 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:04:53 compute-0 nova_compute[186176]: 2026-02-16 18:04:53.062 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:04:53 compute-0 nova_compute[186176]: 2026-02-16 18:04:53.062 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:04:58 compute-0 nova_compute[186176]: 2026-02-16 18:04:58.063 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:58 compute-0 nova_compute[186176]: 2026-02-16 18:04:58.065 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:04:58 compute-0 nova_compute[186176]: 2026-02-16 18:04:58.065 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:04:58 compute-0 nova_compute[186176]: 2026-02-16 18:04:58.066 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:04:58 compute-0 nova_compute[186176]: 2026-02-16 18:04:58.112 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:04:58 compute-0 nova_compute[186176]: 2026-02-16 18:04:58.113 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:04:59 compute-0 podman[195505]: time="2026-02-16T18:04:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:04:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:04:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:04:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:04:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2178 "" "Go-http-client/1.1"
Feb 16 18:05:01 compute-0 openstack_network_exporter[198360]: ERROR   18:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:05:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:05:01 compute-0 openstack_network_exporter[198360]: ERROR   18:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:05:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:05:03 compute-0 nova_compute[186176]: 2026-02-16 18:05:03.114 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:03 compute-0 nova_compute[186176]: 2026-02-16 18:05:03.116 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:03 compute-0 nova_compute[186176]: 2026-02-16 18:05:03.117 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:05:03 compute-0 nova_compute[186176]: 2026-02-16 18:05:03.117 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:03 compute-0 nova_compute[186176]: 2026-02-16 18:05:03.154 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:03 compute-0 nova_compute[186176]: 2026-02-16 18:05:03.154 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:06 compute-0 podman[222299]: 2026-02-16 18:05:06.099542219 +0000 UTC m=+0.069591730 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, 
url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, version=9.7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Feb 16 18:05:08 compute-0 nova_compute[186176]: 2026-02-16 18:05:08.155 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:08 compute-0 nova_compute[186176]: 2026-02-16 18:05:08.158 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:08 compute-0 nova_compute[186176]: 2026-02-16 18:05:08.158 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:05:08 compute-0 nova_compute[186176]: 2026-02-16 18:05:08.158 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:08 compute-0 nova_compute[186176]: 2026-02-16 18:05:08.199 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:08 compute-0 nova_compute[186176]: 2026-02-16 18:05:08.200 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:09 compute-0 podman[222321]: 2026-02-16 18:05:09.100237342 +0000 UTC m=+0.077130385 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 16 18:05:09 compute-0 nova_compute[186176]: 2026-02-16 18:05:09.312 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:05:09 compute-0 nova_compute[186176]: 2026-02-16 18:05:09.333 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:05:09 compute-0 nova_compute[186176]: 2026-02-16 18:05:09.333 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 18:05:09 compute-0 nova_compute[186176]: 2026-02-16 18:05:09.334 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 18:05:09 compute-0 nova_compute[186176]: 2026-02-16 18:05:09.349 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.353 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.354 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.355 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.355 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.578 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.580 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5821MB free_disk=73.22232437133789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.580 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.581 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.781 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.782 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.809 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing inventories for resource provider bb904aac-529f-46ef-9861-9c655a4b383c _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.843 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating ProviderTree inventory for provider bb904aac-529f-46ef-9861-9c655a4b383c from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.843 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Updating inventory in ProviderTree for provider bb904aac-529f-46ef-9861-9c655a4b383c with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.869 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing aggregate associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.905 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Refreshing trait associations for resource provider bb904aac-529f-46ef-9861-9c655a4b383c, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_EXTEND,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_RESCUE_BFV,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE42,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_VMXNET3 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.928 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.962 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.965 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 18:05:10 compute-0 nova_compute[186176]: 2026-02-16 18:05:10.965 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.385s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:05:11 compute-0 nova_compute[186176]: 2026-02-16 18:05:11.967 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:05:11 compute-0 nova_compute[186176]: 2026-02-16 18:05:11.967 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:05:11 compute-0 nova_compute[186176]: 2026-02-16 18:05:11.967 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 18:05:13 compute-0 nova_compute[186176]: 2026-02-16 18:05:13.201 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:13 compute-0 nova_compute[186176]: 2026-02-16 18:05:13.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:05:14 compute-0 nova_compute[186176]: 2026-02-16 18:05:14.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:05:15 compute-0 nova_compute[186176]: 2026-02-16 18:05:15.313 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:05:16 compute-0 podman[222344]: 2026-02-16 18:05:16.116991056 +0000 UTC m=+0.084639349 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 18:05:16 compute-0 podman[222343]: 2026-02-16 18:05:16.142321358 +0000 UTC m=+0.111408046 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, 
org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 16 18:05:18 compute-0 nova_compute[186176]: 2026-02-16 18:05:18.203 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:18 compute-0 nova_compute[186176]: 2026-02-16 18:05:18.205 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:18 compute-0 nova_compute[186176]: 2026-02-16 18:05:18.205 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:05:18 compute-0 nova_compute[186176]: 2026-02-16 18:05:18.205 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:18 compute-0 nova_compute[186176]: 2026-02-16 18:05:18.250 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:18 compute-0 nova_compute[186176]: 2026-02-16 18:05:18.251 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:18 compute-0 nova_compute[186176]: 2026-02-16 18:05:18.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:05:21 compute-0 nova_compute[186176]: 2026-02-16 18:05:21.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:05:23 compute-0 nova_compute[186176]: 2026-02-16 18:05:23.252 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:23 compute-0 nova_compute[186176]: 2026-02-16 18:05:23.254 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:23 compute-0 nova_compute[186176]: 2026-02-16 18:05:23.254 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:05:23 compute-0 nova_compute[186176]: 2026-02-16 18:05:23.255 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:23 compute-0 nova_compute[186176]: 2026-02-16 18:05:23.288 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:23 compute-0 nova_compute[186176]: 2026-02-16 18:05:23.289 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:28 compute-0 nova_compute[186176]: 2026-02-16 18:05:28.290 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:28 compute-0 nova_compute[186176]: 2026-02-16 18:05:28.292 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:28 compute-0 nova_compute[186176]: 2026-02-16 18:05:28.293 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:05:28 compute-0 nova_compute[186176]: 2026-02-16 18:05:28.293 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:28 compute-0 nova_compute[186176]: 2026-02-16 18:05:28.323 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:28 compute-0 nova_compute[186176]: 2026-02-16 18:05:28.323 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:29 compute-0 podman[195505]: time="2026-02-16T18:05:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:05:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:05:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:05:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:05:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2184 "" "Go-http-client/1.1"
Feb 16 18:05:31 compute-0 openstack_network_exporter[198360]: ERROR   18:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:05:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:05:31 compute-0 openstack_network_exporter[198360]: ERROR   18:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:05:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:05:31 compute-0 sshd-session[222395]: Invalid user solana from 2.57.122.210 port 46078
Feb 16 18:05:31 compute-0 sshd-session[222395]: Connection closed by invalid user solana 2.57.122.210 port 46078 [preauth]
Feb 16 18:05:33 compute-0 nova_compute[186176]: 2026-02-16 18:05:33.324 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:33 compute-0 nova_compute[186176]: 2026-02-16 18:05:33.326 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:33 compute-0 nova_compute[186176]: 2026-02-16 18:05:33.326 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:05:33 compute-0 nova_compute[186176]: 2026-02-16 18:05:33.326 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:33 compute-0 nova_compute[186176]: 2026-02-16 18:05:33.366 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:33 compute-0 nova_compute[186176]: 2026-02-16 18:05:33.367 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:37 compute-0 podman[222397]: 2026-02-16 18:05:37.08308024 +0000 UTC m=+0.057660437 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Feb 16 18:05:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:05:38.208 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:05:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:05:38.209 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:05:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:05:38.209 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:05:38 compute-0 nova_compute[186176]: 2026-02-16 18:05:38.368 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:38 compute-0 nova_compute[186176]: 2026-02-16 18:05:38.370 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:38 compute-0 nova_compute[186176]: 2026-02-16 18:05:38.370 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:05:38 compute-0 nova_compute[186176]: 2026-02-16 18:05:38.370 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:38 compute-0 nova_compute[186176]: 2026-02-16 18:05:38.396 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:38 compute-0 nova_compute[186176]: 2026-02-16 18:05:38.396 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:40 compute-0 podman[222418]: 2026-02-16 18:05:40.071542143 +0000 UTC m=+0.045963620 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 16 18:05:43 compute-0 nova_compute[186176]: 2026-02-16 18:05:43.397 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:43 compute-0 nova_compute[186176]: 2026-02-16 18:05:43.399 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:43 compute-0 nova_compute[186176]: 2026-02-16 18:05:43.399 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:05:43 compute-0 nova_compute[186176]: 2026-02-16 18:05:43.399 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:43 compute-0 nova_compute[186176]: 2026-02-16 18:05:43.429 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:43 compute-0 nova_compute[186176]: 2026-02-16 18:05:43.429 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:47 compute-0 podman[222438]: 2026-02-16 18:05:47.097552417 +0000 UTC m=+0.062030821 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 18:05:47 compute-0 podman[222437]: 2026-02-16 18:05:47.118376315 +0000 UTC m=+0.087610475 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 18:05:48 compute-0 nova_compute[186176]: 2026-02-16 18:05:48.431 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:48 compute-0 nova_compute[186176]: 2026-02-16 18:05:48.432 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:48 compute-0 nova_compute[186176]: 2026-02-16 18:05:48.433 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:05:48 compute-0 nova_compute[186176]: 2026-02-16 18:05:48.433 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:48 compute-0 nova_compute[186176]: 2026-02-16 18:05:48.453 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:48 compute-0 nova_compute[186176]: 2026-02-16 18:05:48.454 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:53 compute-0 nova_compute[186176]: 2026-02-16 18:05:53.454 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:53 compute-0 nova_compute[186176]: 2026-02-16 18:05:53.456 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:53 compute-0 nova_compute[186176]: 2026-02-16 18:05:53.457 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:05:53 compute-0 nova_compute[186176]: 2026-02-16 18:05:53.457 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:53 compute-0 nova_compute[186176]: 2026-02-16 18:05:53.502 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:53 compute-0 nova_compute[186176]: 2026-02-16 18:05:53.502 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:58 compute-0 nova_compute[186176]: 2026-02-16 18:05:58.503 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:58 compute-0 nova_compute[186176]: 2026-02-16 18:05:58.506 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:05:58 compute-0 nova_compute[186176]: 2026-02-16 18:05:58.507 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:05:58 compute-0 nova_compute[186176]: 2026-02-16 18:05:58.507 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:58 compute-0 nova_compute[186176]: 2026-02-16 18:05:58.555 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:05:58 compute-0 nova_compute[186176]: 2026-02-16 18:05:58.557 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:05:59 compute-0 podman[195505]: time="2026-02-16T18:05:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:05:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:05:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:05:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:05:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2179 "" "Go-http-client/1.1"
Feb 16 18:06:01 compute-0 openstack_network_exporter[198360]: ERROR   18:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:06:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:06:01 compute-0 openstack_network_exporter[198360]: ERROR   18:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:06:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:06:03 compute-0 nova_compute[186176]: 2026-02-16 18:06:03.557 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:03 compute-0 nova_compute[186176]: 2026-02-16 18:06:03.559 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:03 compute-0 nova_compute[186176]: 2026-02-16 18:06:03.560 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:06:03 compute-0 nova_compute[186176]: 2026-02-16 18:06:03.560 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:03 compute-0 nova_compute[186176]: 2026-02-16 18:06:03.592 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:03 compute-0 nova_compute[186176]: 2026-02-16 18:06:03.593 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:08 compute-0 podman[222488]: 2026-02-16 18:06:08.10301441 +0000 UTC m=+0.067788062 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 16 18:06:08 compute-0 nova_compute[186176]: 2026-02-16 18:06:08.595 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:08 compute-0 nova_compute[186176]: 2026-02-16 18:06:08.596 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:08 compute-0 nova_compute[186176]: 2026-02-16 18:06:08.597 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:06:08 compute-0 nova_compute[186176]: 2026-02-16 18:06:08.597 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:08 compute-0 nova_compute[186176]: 2026-02-16 18:06:08.651 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:08 compute-0 nova_compute[186176]: 2026-02-16 18:06:08.651 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:10 compute-0 nova_compute[186176]: 2026-02-16 18:06:10.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:06:10 compute-0 nova_compute[186176]: 2026-02-16 18:06:10.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 18:06:10 compute-0 nova_compute[186176]: 2026-02-16 18:06:10.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 18:06:10 compute-0 nova_compute[186176]: 2026-02-16 18:06:10.344 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 18:06:11 compute-0 podman[222509]: 2026-02-16 18:06:11.119291478 +0000 UTC m=+0.080186504 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.338 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.339 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.339 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.339 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.513 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.514 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5831MB free_disk=73.22232437133789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.514 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.515 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.586 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.587 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.606 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.624 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.627 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 18:06:11 compute-0 nova_compute[186176]: 2026-02-16 18:06:11.627 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:06:12 compute-0 nova_compute[186176]: 2026-02-16 18:06:12.628 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:06:12 compute-0 nova_compute[186176]: 2026-02-16 18:06:12.629 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 18:06:13 compute-0 nova_compute[186176]: 2026-02-16 18:06:13.652 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:13 compute-0 nova_compute[186176]: 2026-02-16 18:06:13.654 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:13 compute-0 nova_compute[186176]: 2026-02-16 18:06:13.655 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:06:13 compute-0 nova_compute[186176]: 2026-02-16 18:06:13.655 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:13 compute-0 nova_compute[186176]: 2026-02-16 18:06:13.691 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:13 compute-0 nova_compute[186176]: 2026-02-16 18:06:13.692 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:15 compute-0 nova_compute[186176]: 2026-02-16 18:06:15.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:06:15 compute-0 nova_compute[186176]: 2026-02-16 18:06:15.318 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:06:16 compute-0 nova_compute[186176]: 2026-02-16 18:06:16.313 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:06:18 compute-0 podman[222529]: 2026-02-16 18:06:18.105807583 +0000 UTC m=+0.070416266 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 16 18:06:18 compute-0 podman[222528]: 2026-02-16 18:06:18.1229292 +0000 UTC m=+0.098421568 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 16 18:06:18 compute-0 nova_compute[186176]: 2026-02-16 18:06:18.692 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:18 compute-0 nova_compute[186176]: 2026-02-16 18:06:18.695 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:19 compute-0 nova_compute[186176]: 2026-02-16 18:06:19.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:06:21 compute-0 nova_compute[186176]: 2026-02-16 18:06:21.319 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:06:23 compute-0 nova_compute[186176]: 2026-02-16 18:06:23.694 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:23 compute-0 nova_compute[186176]: 2026-02-16 18:06:23.697 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:23 compute-0 nova_compute[186176]: 2026-02-16 18:06:23.697 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:06:23 compute-0 nova_compute[186176]: 2026-02-16 18:06:23.698 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:23 compute-0 nova_compute[186176]: 2026-02-16 18:06:23.736 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:23 compute-0 nova_compute[186176]: 2026-02-16 18:06:23.737 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:28 compute-0 nova_compute[186176]: 2026-02-16 18:06:28.737 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:28 compute-0 nova_compute[186176]: 2026-02-16 18:06:28.739 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:28 compute-0 nova_compute[186176]: 2026-02-16 18:06:28.740 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:06:28 compute-0 nova_compute[186176]: 2026-02-16 18:06:28.740 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:28 compute-0 nova_compute[186176]: 2026-02-16 18:06:28.775 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:28 compute-0 nova_compute[186176]: 2026-02-16 18:06:28.776 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:29 compute-0 podman[195505]: time="2026-02-16T18:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:06:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:06:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2180 "" "Go-http-client/1.1"
Feb 16 18:06:31 compute-0 openstack_network_exporter[198360]: ERROR   18:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:06:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:06:31 compute-0 openstack_network_exporter[198360]: ERROR   18:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:06:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:06:33 compute-0 nova_compute[186176]: 2026-02-16 18:06:33.776 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:33 compute-0 nova_compute[186176]: 2026-02-16 18:06:33.778 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:33 compute-0 nova_compute[186176]: 2026-02-16 18:06:33.779 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:06:33 compute-0 nova_compute[186176]: 2026-02-16 18:06:33.779 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:33 compute-0 nova_compute[186176]: 2026-02-16 18:06:33.813 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:33 compute-0 nova_compute[186176]: 2026-02-16 18:06:33.814 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:06:38.210 105730 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:06:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:06:38.211 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:06:38 compute-0 ovn_metadata_agent[105725]: 2026-02-16 18:06:38.211 105730 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:06:38 compute-0 nova_compute[186176]: 2026-02-16 18:06:38.815 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:38 compute-0 nova_compute[186176]: 2026-02-16 18:06:38.817 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:38 compute-0 nova_compute[186176]: 2026-02-16 18:06:38.817 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:06:38 compute-0 nova_compute[186176]: 2026-02-16 18:06:38.817 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:38 compute-0 nova_compute[186176]: 2026-02-16 18:06:38.852 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:38 compute-0 nova_compute[186176]: 2026-02-16 18:06:38.853 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:39 compute-0 podman[222581]: 2026-02-16 18:06:39.100104672 +0000 UTC m=+0.068446208 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, release=1770267347, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9)
Feb 16 18:06:42 compute-0 podman[222602]: 2026-02-16 18:06:42.107088733 +0000 UTC m=+0.068967511 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 16 18:06:43 compute-0 nova_compute[186176]: 2026-02-16 18:06:43.854 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:43 compute-0 nova_compute[186176]: 2026-02-16 18:06:43.855 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:43 compute-0 nova_compute[186176]: 2026-02-16 18:06:43.856 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:06:43 compute-0 nova_compute[186176]: 2026-02-16 18:06:43.856 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:43 compute-0 nova_compute[186176]: 2026-02-16 18:06:43.862 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:43 compute-0 nova_compute[186176]: 2026-02-16 18:06:43.862 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:48 compute-0 nova_compute[186176]: 2026-02-16 18:06:48.863 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:48 compute-0 nova_compute[186176]: 2026-02-16 18:06:48.865 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:48 compute-0 nova_compute[186176]: 2026-02-16 18:06:48.866 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:06:48 compute-0 nova_compute[186176]: 2026-02-16 18:06:48.866 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:48 compute-0 nova_compute[186176]: 2026-02-16 18:06:48.913 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:48 compute-0 nova_compute[186176]: 2026-02-16 18:06:48.914 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:49 compute-0 podman[222622]: 2026-02-16 18:06:49.083612564 +0000 UTC m=+0.057140183 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 16 18:06:49 compute-0 podman[222621]: 2026-02-16 18:06:49.137777114 +0000 UTC m=+0.109757305 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 16 18:06:53 compute-0 nova_compute[186176]: 2026-02-16 18:06:53.915 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:53 compute-0 nova_compute[186176]: 2026-02-16 18:06:53.917 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:53 compute-0 nova_compute[186176]: 2026-02-16 18:06:53.917 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:06:53 compute-0 nova_compute[186176]: 2026-02-16 18:06:53.917 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:53 compute-0 nova_compute[186176]: 2026-02-16 18:06:53.953 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:53 compute-0 nova_compute[186176]: 2026-02-16 18:06:53.954 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:58 compute-0 nova_compute[186176]: 2026-02-16 18:06:58.955 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:58 compute-0 nova_compute[186176]: 2026-02-16 18:06:58.957 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:06:58 compute-0 nova_compute[186176]: 2026-02-16 18:06:58.958 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:06:58 compute-0 nova_compute[186176]: 2026-02-16 18:06:58.958 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:59 compute-0 nova_compute[186176]: 2026-02-16 18:06:59.014 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:06:59 compute-0 nova_compute[186176]: 2026-02-16 18:06:59.015 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:06:59 compute-0 podman[195505]: time="2026-02-16T18:06:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:06:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:06:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:06:59 compute-0 podman[195505]: @ - - [16/Feb/2026:18:06:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2181 "" "Go-http-client/1.1"
Feb 16 18:07:01 compute-0 openstack_network_exporter[198360]: ERROR   18:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:07:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:07:01 compute-0 openstack_network_exporter[198360]: ERROR   18:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:07:01 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:07:04 compute-0 nova_compute[186176]: 2026-02-16 18:07:04.016 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:04 compute-0 nova_compute[186176]: 2026-02-16 18:07:04.018 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:04 compute-0 nova_compute[186176]: 2026-02-16 18:07:04.018 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:07:04 compute-0 nova_compute[186176]: 2026-02-16 18:07:04.018 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:04 compute-0 nova_compute[186176]: 2026-02-16 18:07:04.052 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:07:04 compute-0 nova_compute[186176]: 2026-02-16 18:07:04.052 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:06 compute-0 nova_compute[186176]: 2026-02-16 18:07:06.177 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:09 compute-0 nova_compute[186176]: 2026-02-16 18:07:09.096 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:09 compute-0 nova_compute[186176]: 2026-02-16 18:07:09.097 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:07:09 compute-0 nova_compute[186176]: 2026-02-16 18:07:09.098 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:07:09 compute-0 nova_compute[186176]: 2026-02-16 18:07:09.098 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:09 compute-0 nova_compute[186176]: 2026-02-16 18:07:09.098 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:09 compute-0 nova_compute[186176]: 2026-02-16 18:07:09.100 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:07:10 compute-0 podman[222671]: 2026-02-16 18:07:10.106261317 +0000 UTC m=+0.080299517 container health_status 9b060b47d8cd8ffc58675182a25f7ea77888e374edf455ddae868b0b8ed206a4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, name=ubi9/ubi-minimal, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.352 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.353 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.402 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.403 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.404 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.404 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.561 186180 WARNING nova.virt.libvirt.driver [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.562 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5839MB free_disk=73.22254943847656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.563 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.563 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.633 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.633 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.660 186180 DEBUG nova.compute.provider_tree [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed in ProviderTree for provider: bb904aac-529f-46ef-9861-9c655a4b383c update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.678 186180 DEBUG nova.scheduler.client.report [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Inventory has not changed for provider bb904aac-529f-46ef-9861-9c655a4b383c based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.680 186180 DEBUG nova.compute.resource_tracker [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 16 18:07:12 compute-0 nova_compute[186176]: 2026-02-16 18:07:12.680 186180 DEBUG oslo_concurrency.lockutils [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 16 18:07:13 compute-0 podman[222692]: 2026-02-16 18:07:13.131939714 +0000 UTC m=+0.099409993 container health_status 216ad1d749e6aefed73d23988973b3b8d21e9711b63e2c5bf4f46deadaf9f907 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 16 18:07:13 compute-0 nova_compute[186176]: 2026-02-16 18:07:13.644 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:13 compute-0 nova_compute[186176]: 2026-02-16 18:07:13.674 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:14 compute-0 nova_compute[186176]: 2026-02-16 18:07:14.100 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:14 compute-0 nova_compute[186176]: 2026-02-16 18:07:14.103 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:14 compute-0 nova_compute[186176]: 2026-02-16 18:07:14.104 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:07:14 compute-0 nova_compute[186176]: 2026-02-16 18:07:14.104 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:14 compute-0 nova_compute[186176]: 2026-02-16 18:07:14.138 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:07:14 compute-0 nova_compute[186176]: 2026-02-16 18:07:14.139 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:14 compute-0 nova_compute[186176]: 2026-02-16 18:07:14.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:14 compute-0 nova_compute[186176]: 2026-02-16 18:07:14.317 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 16 18:07:15 compute-0 nova_compute[186176]: 2026-02-16 18:07:15.319 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:16 compute-0 nova_compute[186176]: 2026-02-16 18:07:16.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:17 compute-0 nova_compute[186176]: 2026-02-16 18:07:17.314 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:19 compute-0 nova_compute[186176]: 2026-02-16 18:07:19.140 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:19 compute-0 nova_compute[186176]: 2026-02-16 18:07:19.142 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:19 compute-0 nova_compute[186176]: 2026-02-16 18:07:19.142 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:07:19 compute-0 nova_compute[186176]: 2026-02-16 18:07:19.143 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:19 compute-0 nova_compute[186176]: 2026-02-16 18:07:19.170 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:07:19 compute-0 nova_compute[186176]: 2026-02-16 18:07:19.171 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:19 compute-0 nova_compute[186176]: 2026-02-16 18:07:19.316 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:20 compute-0 podman[222713]: 2026-02-16 18:07:20.133289021 +0000 UTC m=+0.095289221 container health_status a2f6c7ff2890bfe2292ae19bf121bd81a4f1addbf78ebf060395c5cf6089fd2e (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 16 18:07:20 compute-0 podman[222712]: 2026-02-16 18:07:20.133872775 +0000 UTC m=+0.101011931 container health_status 6a18c5c5f9ddbca9b7735ca96f5f7307a41866728ffd79051a82638b01788e73 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '47fe5fffdd248608ce716dda65aeeedaa9017a62644d187ab1eb379cba3e1021-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d-1de1a8df86f591245971e6efe2d1ca5550d18ae1ff9cc6f4069ece5943ab586d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 16 18:07:22 compute-0 nova_compute[186176]: 2026-02-16 18:07:22.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:23 compute-0 nova_compute[186176]: 2026-02-16 18:07:23.336 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:24 compute-0 nova_compute[186176]: 2026-02-16 18:07:24.224 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:24 compute-0 nova_compute[186176]: 2026-02-16 18:07:24.225 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:07:24 compute-0 nova_compute[186176]: 2026-02-16 18:07:24.225 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5054 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:07:24 compute-0 nova_compute[186176]: 2026-02-16 18:07:24.225 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:24 compute-0 nova_compute[186176]: 2026-02-16 18:07:24.226 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:24 compute-0 nova_compute[186176]: 2026-02-16 18:07:24.227 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:07:25 compute-0 sshd-session[222762]: Accepted publickey for zuul from 192.168.122.10 port 41776 ssh2: ECDSA SHA256:q7HzukJ1UTOVUoYACW9oq0aMm7uX5Qh8e8uWlj1xf2I
Feb 16 18:07:25 compute-0 systemd-logind[821]: New session 50 of user zuul.
Feb 16 18:07:25 compute-0 systemd[1]: Started Session 50 of User zuul.
Feb 16 18:07:25 compute-0 sshd-session[222762]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 16 18:07:25 compute-0 sudo[222766]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 16 18:07:25 compute-0 sudo[222766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 16 18:07:29 compute-0 nova_compute[186176]: 2026-02-16 18:07:29.227 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:29 compute-0 nova_compute[186176]: 2026-02-16 18:07:29.230 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:29 compute-0 nova_compute[186176]: 2026-02-16 18:07:29.230 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:07:29 compute-0 nova_compute[186176]: 2026-02-16 18:07:29.230 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:29 compute-0 nova_compute[186176]: 2026-02-16 18:07:29.274 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:07:29 compute-0 nova_compute[186176]: 2026-02-16 18:07:29.274 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:29 compute-0 ovs-vsctl[222937]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 16 18:07:29 compute-0 podman[195505]: time="2026-02-16T18:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 16 18:07:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 16012 "" "Go-http-client/1.1"
Feb 16 18:07:29 compute-0 podman[195505]: @ - - [16/Feb/2026:18:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2182 "" "Go-http-client/1.1"
Feb 16 18:07:30 compute-0 virtqemud[185389]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 16 18:07:30 compute-0 virtqemud[185389]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 16 18:07:30 compute-0 virtqemud[185389]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 16 18:07:30 compute-0 nova_compute[186176]: 2026-02-16 18:07:30.317 186180 DEBUG oslo_service.periodic_task [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 16 18:07:30 compute-0 nova_compute[186176]: 2026-02-16 18:07:30.318 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 16 18:07:30 compute-0 nova_compute[186176]: 2026-02-16 18:07:30.501 186180 DEBUG nova.compute.manager [None req-174fa9fc-754d-49bf-b5eb-8a465a3544fc - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 16 18:07:31 compute-0 crontab[223334]: (root) LIST (root)
Feb 16 18:07:31 compute-0 openstack_network_exporter[198360]: ERROR   18:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 16 18:07:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:07:31 compute-0 openstack_network_exporter[198360]: ERROR   18:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 16 18:07:31 compute-0 openstack_network_exporter[198360]: 
Feb 16 18:07:32 compute-0 systemd[1]: Starting Hostname Service...
Feb 16 18:07:32 compute-0 systemd[1]: Started Hostname Service.
Feb 16 18:07:34 compute-0 nova_compute[186176]: 2026-02-16 18:07:34.275 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:34 compute-0 nova_compute[186176]: 2026-02-16 18:07:34.278 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 16 18:07:34 compute-0 nova_compute[186176]: 2026-02-16 18:07:34.279 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 16 18:07:34 compute-0 nova_compute[186176]: 2026-02-16 18:07:34.279 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 16 18:07:34 compute-0 nova_compute[186176]: 2026-02-16 18:07:34.308 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 16 18:07:34 compute-0 nova_compute[186176]: 2026-02-16 18:07:34.309 186180 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
